
Talk:Signal

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by Firefly322 (talk | contribs) at 04:14, 22 June 2007 (Recommend the Usage of Standard Signal and System Text). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

WikiProject Telecommunications (Unassessed)
This article is within the scope of WikiProject Telecommunications, a collaborative effort to improve the coverage of Telecommunications on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
This article has not yet received a rating on Wikipedia's content assessment scale.
This article has not yet received a rating on the project's importance scale.

As I understand it, a signal needs a temporal component (modulation in time). But the Fourier transform of that signal loses the temporal component. Thus I do not see how that FT(signal) can also be called a signal, which is why I renamed it signature, which describes a characteristic of said signal. Ancheta Wis 02:01, 27 Feb 2004 (UTC)
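
(For reference, the standard continuous-time Fourier transform behind this point is

\[ X(f) = \int_{-\infty}^{\infty} x(t)\, e^{-j 2 \pi f t}\, dt \]

Each value X(f) integrates over all of time, so the spectrum by itself carries no time index; that is the loss of temporal localization described above. The transform pair is invertible, so the information is redistributed rather than destroyed.)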

On Nov. 30, 2005, I edited the article extensively to correct the following problems: 1) misstatement of the notion of a signal in information theory: the notion is not that it is a flow of information but rather that it is a sequence of states of a communications channel at the transmitter. 2) confusion of "data" with "signal". Some of the problems are: a) analog and digital are properties of data, NOT of signals; b) it is possible to perform frequency analysis on data, but it is not possible to perform it on a signal; c) a signal is not a sequence of numbers, but a sequence of numbers can be properly referenced as "data." As the article is about "signal" and not about "data," the best way to handle this situation seems to be to edit out the references to "signals" that are really "data." 3) The claim that entropy is a property of a signal or set of signals is incorrect. Entropy is a property of a communications channel. As the article is about the concept of a "signal," it seems best to leave out discussion of entropy.

One of the properties of the "signal" that is defined in information theory is that the process that generates it is a stochastic process. As this is sometimes a source of confusion, I added a discussion of this topic. Terry Oldberg http://www.oldberg.biz terry@oldberg.biz

Hours after I contributed an edit of the article plus the above justification for it, "Shanel" wiped it out without supplying a justification. Shanel: please either attempt a justification of your action or restore the text as edited by me.
Terry Oldberg Dec. 1, 2005

Terry, what other characteristics of signals are there besides the stochastic attribute of the signal-generating process? And why is that one so important? What about the channel's characteristics and influence? Your first paragraph is useful because it helps people distinguish a signal from a waveform and a message. Many people have the concept of a signal as something radiating from the tip of an antenna, i.e. radar and radio, and many people would probably relate to that. Many people discover signals through their interest in digital audio/video. People coming from different backgrounds should find something they can relate to in the article. Rtdrury 20:56, 3 December 2005 (UTC)[reply]

RTdrury: The stochastic attribute is important because it implies a constraint on the signal-generating process, for not all processes are stochastic. I'm sensitive to this problem as a result of having once attempted statistical research in the field of nondestructive testing. Through misuse of the word "signal," workers in the field implied that a process was stochastic when it was not. In doing so, they implied that this process obeyed conventional statistics when this was not true. One of the results of the misuse was (and still is) to expose the people of the world to unnecessary hazards from such events as explosions of nuclear reactors and downings of aircraft. You are quite right in implying that the channel characteristics and influence are important. However, it seems to me that it would help untangle Wikipedia if they were to be discussed in a separate article on the notion of a communications channel. After all, a signal and a communications channel are two different entities.
Terry Oldberg 07:25, 15 December 2005 (UTC)[reply]

I came onto this article looking for a place to Wiki-link the term signal in the article Optical communications. The old article seems to have roughly the definition of signal that is used in that field, that is, a time-varying quantity of interest, regardless of how it was produced. But, as Terry points out, that's not the correct definition for the field of information theory. I'd like to move the old article to a new title like "signal (circuit theory)" or "signal (circuits and systems)" or "signal (communications)", but I'm not sure of the best title --- anyone feeling a little more bold want to just pick? -- The Photon 01:54, 4 December 2005 (UTC)[reply]

I'm not clear on the distinction you draw between signals and data. For instance, I gather that you're trying to distinguish a digital data stream from the continuous electrical quantities that represent it, but it would be good if you could explain what exactly you mean. Furthermore, as the author of this article, I have to say that everything I've been taught in two semesters of signals theory supports the definitions I used.
Finally, as far as wiki etiquette is concerned, if you ever see the need to delete a large amount of text from an article, please replace it somewhere else. For now, I'm going to put the text that you've removed back into the article, with a note saying that you contest its accuracy. --Smack (talk) 03:40, 14 December 2005 (UTC)[reply]
Smack, are you certain that the definition of "signal" is the same in information theory and in signals theory? I don't know if signals theory is the same as what we called "signal analysis" in my university, but if it is, that field uses very different tools from information theory, and I'm not surprised if information theory has a very specific definition of "signal", and it's not what you'd expect if you aren't an information theorist (since very little in information theory is what you'd expect from lay knowledge of the terms it uses). Again let me suggest moving the current article to Signal analysis or Signal (circuits and systems), and let the information theory people have an article that correctly relates to information theory.
The Photon 05:11, 14 December 2005 (UTC)[reply]
Smack: The controversy over the wording of the article raises the question of whether there is an important distinction between a number and the representation of a number in telecommunications hardware; the latter is a sequence of states of the communications channel. If this distinction is preserved, there is, for example, a difference between 01100011 and the sequence of states of a communications channel that represents this number during transmission. The former is a sequence of digits. The latter may be a sequence of voltages across conductors. Should we blur the distinction between the former and the latter?
In the defining paper of information theory, "A Mathematical Theory of Communication," Shannon distinguishes between the two. Numbers are a subset of the "characters" which, in sequence, make up what Shannon calls a "message." The sequence of states of the communications channel, at the transmitter end of it, makes up what Shannon calls the "signal." Shannon's "message" is what one now calls the "data."
Shannon's "received signal" differs from his "signal" through the entry into the communications channel of noise but the received message may, through the use of an error correcting code, be identical to the transmitted message. His "signal" may be continuous in time when his "message" is discrete in time, or vice versa. These are some of the differences between Shannon's "signal" and his "message."
In view of the above facts, I submit that it is essential for Wikipedia's article on "signal (information theory)" to preserve the distinction between a number and the representation of a number as a sequence of states of a physical system. When this distinction is preserved, a "signal" is a sequence of states, at the transmitter. A number or sequence of numbers is a subset of what Shannon calls a "message" and one now calls "data." The "signal" encodes the "message."
Terry Oldberg 07:25, 15 December 2005 (UTC)[reply]
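A minimal sketch, in Python, of the message/signal distinction described above; the two-level line code and voltage values are hypothetical, chosen only to illustrate the point:

```python
# Shannon's "message": a sequence of characters (here, the digits 01100011).
message = "01100011"

# Hypothetical line code: map each digit to a channel state (a voltage level).
VOLTS = {"0": 0.0, "1": 5.0}

# The "signal": the sequence of channel states that encodes the message.
signal = [VOLTS[bit] for bit in message]
print(signal)  # [0.0, 5.0, 5.0, 0.0, 0.0, 0.0, 5.0, 5.0]
```

The same message could be encoded by entirely different channel states (light pulses, tones), which is why the two notions remain distinct.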
Terry, You inspired me to look up Shannon's paper (here). Shannon does mention both messages and signals which occur in discrete time, and he mentions both continuously valued and quantized signals as well (his example is a telegram, where the signal is composed of "dots, dashes, and spaces"). This seems to make much of the discussion about digital and analog signals, as well as discrete-time and continuous-time signals, relevant to information theory. Is there a different preferred terminology in information theory that you could put in place of "analog", "discrete", etc., instead of removing these sections entirely?
Part I of Shannon's paper is titled "the discrete noiseless system", in which the transmitter, channel, and receiver are all noise-free. The bulk of Part I is a discussion of the statistical nature of the message: "the messages to be transmitted consist of sequences of letters...they form sentences and have the statistical structure of, say, English." It looks as if Shannon's formulation calls for the message to be generated by a stochastic process (or a process which, since we can't predict it, we must model as a stochastic process). For example, he generates a number of artificial sentences, with increasingly sophisticated models of English. Sections 2, 3, 4, 5, 6, and 7 of Part I discuss the stochastic nature of the source of the message. The transmitted signal has a stochastic nature only because it is generated (noiselessly) from the stochastic message.
Shannon's Theorem 7 relates the entropy of the transmitted signal to the entropy of the source, or of the message. That Shannon would calculate the entropy of a signal seems to conflict with the statement that "The claim that entropy is a property of a signal or set of signals is incorrect." Shannon's paper refers to the entropy of both message sources and signals, and the capacity of the channel.
In Part II Shannon uses the term received signal, indicating that the term signal is not only associated with the output of the transmitter and the input to the channel. This seems to conflict with the definition of a signal as "the sequence of states of a communications channel that encodes a message, at the transmitter end of the channel" [emphasis added].
I'm not yet clear on whether frequency analysis has a place here. I haven't yet dug it out of Shannon's paper, but I've just noticed that in Part III there is some discussion of channels characterized by their frequency and impulse responses --- I'll withhold any opinion until I've dug further into that part of the paper.
To wrap up, going by Shannon's paper, most of the current version of the article does seem to be relevant to information theory. Signals may be either discrete-time or continuous, and they may be either continuously valued or quantized, but perhaps information theory uses different words for these concepts. Signals may occur at either end of a channel, and they are characterized by an associated entropy. There is still some room for improvement in the article to clarify the difference between sources, messages, signals, and channels; and the terminology might not be exactly right for information theory. Nonetheless the bulk of the material in the article should be cleaned up, not eliminated.
The Photon 07:03, 16 December 2005 (UTC)[reply]
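As an illustration of the entropy computation at issue above, here is a minimal sketch for a discrete source in the spirit of Shannon's telegraph example; the symbol probabilities are made up:

```python
import math

# Hypothetical telegraph-like source with made-up symbol probabilities.
probs = {"dot": 0.5, "dash": 0.3, "space": 0.2}

# Shannon's entropy of a discrete source: H = -sum(p_i * log2(p_i)).
H = -sum(p * math.log2(p) for p in probs.values())
print(f"H = {H:.3f} bits per symbol")  # H = 1.485 bits per symbol
```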
It looks like this is the time for me to step aside from this issue. What's the difference between information theory and signals theory? I wanted to name this article "Signal (signals theory)", but that would have been a circular definition. I also don't like the proposed qualifier "circuits and systems", because signal theory transcends electrical engineering. --Smack (talk) 19:45, 21 December 2005 (UTC)[reply]
Have a look at these questions in their most primitive form: a single bit of information is represented by a change in state, from one thing to another thing. So to measure anything, you always need to compare two things (before+after, upper/lower bound, hotter/colder, etc.). This change may be discrete or continuous, and is called a signal. -- Waveguy 22:37, 21 December 2005 (UTC)[reply]

Need a broader definition of Signal

The current message-oriented definition is way too narrow. Articles such as Spectral density need to reference signal, but there's no appropriate definition. Should we make yet another signal page for the broader definition? Or just broaden this one?

At the risk of opening up a can of worms, I'm going to attempt a new broad intro. Dicklyon 01:58, 3 June 2006 (UTC)[reply]

I agree with you. Also, I think the title should be "Signal (electrical engineering)". I think there is no need to have a separate article for "Signal (information theory)" (we can have it as a small section of this article). BorzouBarzegar 17:59, 7 June 2006 (UTC)[reply]
I agree it would be nice to have something to link to the word signal, but on the other hand Wikipedia is not a dictionary. Despite all the fuss I made above over this article, I now think the real best answer is to define the term signal in the appropriate articles, such as Information theory, Telecommunications, Signal processing, etc. This article is either just a dictionary definition, or a second-rate rehash of what ought to be in those other articles.
That said, if you're going to keep this article, but broaden its sense to cover meanings of the word signal outside of information theory, then the name should definitely be changed. -- The Photon 04:40, 8 June 2006 (UTC)[reply]
The concept of Signal in Electrical Engineering definitely needs an article. BorzouBarzegar 15:56, 8 June 2006 (UTC)[reply]
If you mean we need distinct pages, what definitions would you use that would differ between information theory and electrical engineering? In all my training, I never found such a distinction. Or are you just supporting renaming this one? Dicklyon 17:44, 8 June 2006 (UTC)[reply]
I'm just supporting renaming this one to "Signal (electrical engineering)". BorzouBarzegar 19:57, 8 June 2006 (UTC)[reply]
It seems that there isn't any objection to changing the title. So, I am moving the page. BorzouBarzegar 20:32, 8 June 2006 (UTC)[reply]
I think it should be moved back to Signal (information theory), or maybe something else entirely. As the "examples" section of the page points out, "signal" is something that is used by biology, physics, etc. A continuous signal can be sent from point A to point B without an electrical circuit ever being involved. Neurons are the clearest example, but I'm sure there are (and will be) other examples. --Interiot 20:09, 21 May 2007 (UTC)[reply]
That's true, but electrical engineering is the field in which signals are mostly studied, so it works as it is. Information theory talks about signals, but has a rather narrower view of them than is used in EE and other fields, where they are quite often treated outside of an information framework. Dicklyon 03:02, 22 May 2007 (UTC)[reply]

Proposed merger

I propose merging Signal processing into this article because

-- The Photon 04:24, 10 June 2006 (UTC)[reply]

I think they're distinct enough content areas to keep separated. The signals article talks about different types and classifications of signals, how the term is used in different fields (some of which, like information theory, have somewhat different uses than the broad signal processing field), and stuff like that. The signal processing article is more about techniques and application areas. I think merging them would be messy. Dicklyon 17:57, 10 June 2006 (UTC)[reply]
The Signal processing article doesn't talk about any of that. Everything that's currently in that article would fit comfortably in this one.
Digital signal processing does discuss the techniques and applications, and Analog signal processing could, but it's also just a stub. So my proposal is to give the top-level overview in Signal (electrical engineering), and put applications and techniques information into the more specific "Digital ..." and "Analog ..." articles. From your input, I might change my above reasoning (2nd bullet) to say, "if Signal processing were developed into a PERFECT article, it would overlap almost entirely with either Signal (electrical engineering) or Digital signal processing or Analog signal processing."
There's a tangential issue that Analog signal processing is not a common term (at least in my experience), and Filter theory or Filter or Analog electronics would be better titles for that article.
-- The Photon 05:21, 11 June 2006 (UTC)[reply]
I agree that there's a lot of overlap in the topics covered. However, signal processing is such a basic subject... I just don't think it would look right if someone were to search for signal processing and be redirected to some other article. Wouldn't it be better if the final article were called Signal processing, rather than Signal (electrical engineering)? Don't you think the former is a more common search term? --Zvika 17:01, 12 June 2006 (UTC)[reply]
If I were the sole editor of a traditional encyclopedia, I'd probably rather do it that way. There are so many different ways to define the word signal that each subject area should just treat the topic within its own article. But for Wikipedia, there are a couple of reasons to do it the other way around:
  • Editors of other articles will want to be able to make a link to Signal more often than to Signal processing. If that article doesn't exist, someone will re-create it.
  • It's clear (maybe just to me) that signal processing is a subtopic of signals, but not obvious that signals are a subtopic of signal processing. It's possible to have signals without signal processing, but not the other way around.
But, I could go either way on this. -- The Photon 02:40, 13 June 2006 (UTC)[reply]

I prefer having separate articles. This article should be about the basic concept of the signal and its types. "Signal processing" should be about the field and its branches and basic techniques. I don't think that there will be much overlap. Even if we merge them now, we will eventually need to split them. BorzouBarzegar 13:12, 13 June 2006 (UTC)[reply]

Correction

Under Analog and Digital Signals -> Discretization, it says,

"DT signals often arise via of CT signals."

Should this just say, "via"?

--208.188.2.93 18:20, 18 July 2006 (UTC)[reply]

Yes, fix it. Dicklyon 18:34, 18 July 2006 (UTC)[reply]
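For what it's worth, a minimal sketch of one standard way DT signals arise from CT signals, namely sampling; whether that matches the article's intended wording isn't fully clear from the quote, and the 5 Hz sinusoid and sample rate here are arbitrary illustrative choices:

```python
import math

# Hypothetical CT signal: a 5 Hz sinusoid.
def x(t):
    return math.sin(2 * math.pi * 5 * t)

# Sampling: the DT signal x[n] = x(nT) arises from the CT signal x(t).
T = 0.01  # sampling interval in seconds (a 100 Hz sample rate)
samples = [x(n * T) for n in range(10)]
print(samples)
```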

Recommend the Usage of Standard Signal and System Text

Regarding the comment from 2004 at the top of this page, in ABET-accredited ELE/ECE junior-level signal and systems courses, a periodic time-domain signal is defined as a type of vector. Its vector components can be determined by way of the Fourier series. The notion of a vector carries over into non-periodic signals, where the Fourier series is generalized to the Fourier transform. A highly respectable reference for this train of thought is Signal Processing and Linear Systems by B.P. Lathi. I suggest the addition of these and similar structural ideas from B.P. Lathi's text or another standard ELE/ECE text such as Continuous and Discrete Signal and System Analysis by George R. Cooper and Clare D. McGillem. --Firefly322 03:33, 22 June 2007 (UTC)[reply]
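(For reference, the Fourier-series decomposition that treatment relies on, with the coefficients c_n playing the role of vector components; this is the standard form, not quoted from Lathi:

\[ x(t) = \sum_{n=-\infty}^{\infty} c_n\, e^{j n \omega_0 t}, \qquad c_n = \frac{1}{T} \int_{T} x(t)\, e^{-j n \omega_0 t}\, dt \]

Here \omega_0 = 2\pi/T is the fundamental frequency and T the period.)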

Certainly describing signals as vectors can be worthwhile. Restricting attention to signals that are periodic, or that have Fourier transforms, however, is pretty limiting, as it leaves out, for example, signals that are stationary random processes. And lots of decompositions besides Fourier ones are important. So if you go there, try not to make it too narrowing. Dicklyon 03:41, 22 June 2007 (UTC)[reply]
I do also agree that relying on a good solid source like a text book would lead to improvements. And if I don't like what your book says, I'll be motivated to go find alternatives in other books. Too often we argue here just because we're too lazy to find sources. Dicklyon 03:45, 22 June 2007 (UTC)[reply]
Okay... Now I realize it's not often taught in this way, but a process, random or otherwise, is in fact just a sequence of events. A good example is a Markov chain, where each probability matrix in the chain can be viewed as a single hyper-number conceptually similar to other hyper-numbers (e.g., a quaternion or an octonion). Anyway, my ultimate perspective is that random processes have components that map to the elements of vectors/matrices. And I'm just wondering if you are suggesting something else? --Firefly322 03:59, 22 June 2007 (UTC)[reply]
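A minimal sketch of the Markov-chain view described above: one realization of a random process as a sequence of states; the two-state transition matrix is hypothetical:

```python
import random

# Hypothetical transition matrix P[current][next] for a two-state chain.
P = {"A": {"A": 0.9, "B": 0.1},
     "B": {"A": 0.5, "B": 0.5}}

# Generate one realization of the process: a sequence of states (events).
state, chain = "A", ["A"]
for _ in range(10):
    r, cum = random.random(), 0.0
    for nxt, p in P[state].items():
        cum += p
        if r < cum:
            state = nxt
            break
    chain.append(state)
print(chain)  # e.g. ['A', 'A', 'A', 'B', 'B', 'A', ...]
```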