This article is rated Start-class on Wikipedia's content assessment scale. It is of interest to multiple WikiProjects.
The explanation is bogus and does not make sense; please correct it or insert a citation that explains it as currently written.
The defect explained here sounds like TV strobing.
—Preceding unsigned comment added by 92.87.192.98 (talk) 16:57, 16 February 2009 (UTC)
Strobing? We are not talking about jerkiness here, and the image hardly represents what you call "strobing". Eugene2x-talk 23:15, 5 March 2009 (UTC)
"The concept of breaking a single video frame into interlaced 'fields' was first demonstrated by Russian inventor Léon Theremin in 1927" (Albert Glinsky: Theremin, University of Illinois Press, 2000)
First of all, I do not doubt that Theremin invented his electronic instrument, the theremin, and other things in the 1920s. These are proven facts.
But with all respect, I very much doubt this whole story of Theremin as inventor of television and of interlacing. It sounds like one of those classic Russian/Soviet falsifications to me.
The whole story is based on a single source, the book mentioned above, published in 2000. The only other source would be Theremin's own few lines on the topic in the book "A. F. Joffe – Memories" [my own translation], Academy of Sciences Press [again my own translation], Moscow, 1973, offering a fantastic story of how he invented television devices with a few lines, then 62 and even 120 lines (incorporating an interlaced-line technique), within only two or three (!) years (from 1924 or 1925 to 1927) as part of his academic thesis! Moreover, he was travelling a lot during that period to present his theremin, to negotiate licence issues for it, and so on. His alleged development of television of course had to be temporarily halted during his extensive travels in 1925/26. He also claims to have invented in 1927 a portable (!) camera, or even a whole television system, with 100 lines, which could operate outdoors under daylight conditions, meaning without any additional light source, in 1927!
Should these claims be true, Theremin would have been way ahead of all other television pioneers. I think, and 'early television buffs' would agree, that about one decade ahead would not be exaggerated in this case. The history of early television would have to be rewritten.
The absurd thing about all of these fantastic 'achievements' regarding television devices is that there is not a single proof for even one of these claims. No photos, no patent files, no drawings, no working schematics, no technical descriptions, nothing detailed and nothing general, no presentations (he travelled a lot and had his own laboratory in the US during the early 1930s), no contemporary articles. Absolutely nothing. Zero.
However, Theremin himself claims that there was an article in the magazine Ogonyok [my own transliteration] in the 1920s. But even if that were true, it would not change anything about the missing proof: Theremin may indeed have written a (theoretical) thesis about television and may have done some research. But if one remembers the tons of propaganda put out especially by the early Soviet Union to show how 'progressive' and 'modern' the largely backward country was, it is not unlikely that Ogonyok somehow 'sexed up' its article a little.
There is some literature about Theremin from (communist) East Germany, a Soviet puppet state which glorified everything Soviet/Russian. Theremin was a big celebrity in East Germany due to his various achievements. But in none of this literature could I discover anything new about his alleged achievements in the field of television. It is always the same few statements, which I have tried to give above in my own words.
So I have good reason to assume that this whole 'Theremin-television story' is nothing more than a huge fake!
I can only hope that it is only Russian chauvinists who spread such allegations via the 'University of Illinois Press' and across the corresponding Wikipedia articles.
For my part, I will erase those unproven statements about Theremin and television.
Greets. —Preceding unsigned comment added by 84.157.69.85 (talk) 19:34, 5 August 2009 (UTC)
How can PAL, NTSC, VGA and SCART switch between progressive scan (like 240p) and interlaced scan? Is it just timing? A single sentence with a reference would suffice (and you know the correct section). It would add hard facts to the article, and we could remove some of the advantage/disadvantage material. -- Arnero (talk) 17:00, 8 March 2010 (UTC)
I am here in response to a request posted on the NPOV noticeboard.
The question was about the sentence, "Interlace is a technique of improving the picture quality of a video signal without consuming extra bandwidth", which currently is the first sentence of the article. It is my opinion that this is not NPOV. It would be better to say something like, "Interlace is a technique of displaying video." Then subsequently, its quality could be compared to other video display techniques (without saying one is improved over the other) and its bandwidth use could likewise be compared. Thoughts? Blue Rasberry 17:06, 25 April 2010 (UTC)
Interlacing was adopted as a means of increasing the number of scan lines of early TV systems, and thus the vertical resolution, while staying within a 6 MHz TV channel. The field rate of 60 Hz gave interlaced video an effective frame rate of 30 fps, exceeding the 24 fps frame rate of motion-picture film, and allowed TV equipment to synchronize to the AC power-line frequency of 60 Hz. Both persistence of vision (the phi phenomenon) and phosphor persistence reduce apparent flicker. This technology was developed by RCA in the 1930s in and around New York City (NBC) and at RCA headquarters in Camden, New Jersey. In short, interlacing improved picture quality by increasing the number of scan lines without exceeding the allocated RF spectrum.
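The arithmetic in the paragraph above can be sketched in a few lines. This is only an illustration using the standard NTSC numbers; the constant names are mine, not drawn from any standard:

```python
# Illustrative arithmetic for the interlace trade-off described above,
# using the standard NTSC parameters as an example.
LINES_PER_FRAME = 525    # total scan lines in one NTSC frame
FIELD_RATE_HZ = 60       # fields per second, matching the 60 Hz AC mains
FIELDS_PER_FRAME = 2     # interlace: odd lines first, then even lines

frame_rate = FIELD_RATE_HZ / FIELDS_PER_FRAME        # effective frames/s
lines_per_field = LINES_PER_FRAME / FIELDS_PER_FRAME # lines drawn per field

print(f"Frame rate: {frame_rate} fps (vs. 24 fps film)")
print(f"Lines per field: {lines_per_field}")
```

Each field carries only half the lines, so the full 525-line picture is refreshed 30 times a second while the screen is lit 60 times a second, which is what keeps flicker down without widening the channel.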
I think that the original statement is uncontroversially true...
Can it be then backed up by a source/citation? --Xerces8 (talk) 13:15, 9 May 2010 (UTC)
If you have a problem with the word "improving", say "doubling the apparent vertical resolution". — Preceding unsigned comment added by 172.91.176.10 (talk) 05:05, 3 September 2018 (UTC)
Page moved to Interlaced video. Vegaswikian (talk) 19:33, 14 November 2010 (UTC)
Interlace → Interlace (video) — Relisted. There may be a consensus forming for a different name in the discussion, but if so we need some clear indications of support. Vegaswikian (talk) 22:56, 7 November 2010 (UTC) Especially now the video technique is gradually falling from use, it does not meet the requirement of WP:PRIMARY, & the plain term should go straight to disambiguation. Johnbod (talk) 20:25, 31 October 2010 (UTC)
Adding tags to talk pages is fun and easy. However, not every article belongs in every project. Adding a very low relevance project to an article's talk page will have no effect in improving the article and will just annoy people concerned about the project's backlog. This article is pretty important to the subject of "television", but just because they watch interlaced broadcast video in Paraguay doesn't mean it should be added to the Paraguay project. --Wtshymanski (talk) 15:04, 28 February 2011 (UTC)
From the description, he developed a scanning television method. But where does interlace come into it? Did his mirror drum already produce interlaced pictures? -- megA (talk) 12:28, 23 June 2011 (UTC)
Just wondering about the future of interlacing. Interlacing was originally designed for fast-refresh-rate CRT use, but CRTs are no longer produced, so interlacing could disappear altogether.
Does anybody know if OLED refresh rates are up to the job? I don't understand why interlacing should be dropped: like the other digital compression mechanisms, it uses less bandwidth, and future displays could have refresh rates up to the task. I read that OLED could manage it with little quality loss even compared to a CRT. Should somebody mention this under the future of interlacing? — Preceding unsigned comment added by 86.27.131.165 (talk) 10:24, 14 October 2014 (UTC)
I think the first line should be corrected as: «Interlaced video is a technique for doubling the perceived vertical resolution of a video display without consuming extra bandwidth».Senbei64 (talk) 11:35, 24 September 2015 (UTC)
When the HDTV standards were first established in the U.S. in the 1990s, broadcasters had a choice between vertical resolutions of 1080 interlaced or 720 progressive. Digital compression was in its infancy, and MPEG-2 was chosen as the compression standard (and remains so as of this writing). In the U.S. there was (and still is) a hard limit of 6 MHz of RF spectrum for each TV channel, which will carry 19.39 Mbps of digital data. Interlace was preserved as a legacy of analog broadcasting. These resolutions were initially offered as a way of delivering 16:9 high-definition pictures.
20 years later (as of this writing), digital compression has come a long way, with broadcasters using a portion of their 19.39 Mbps for their main programming service, along with digital subchannels often carrying vintage standard-definition programs and movies. — Preceding unsigned comment added by 172.91.176.10 (talk) 07:54, 18 June 2019 (UTC)
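A back-of-the-envelope comparison makes the 1080i/720p choice discussed above concrete. The pixel counts and rates below are the published format parameters; the snippet only illustrates that the raw pixel throughput of the two options is similar, which is why both could be compressed into the same channel:

```python
# Rough pixel-throughput comparison of the two original ATSC HD formats.
# 1080i delivers 30 full frames/s (as 60 fields/s); 720p delivers 60 frames/s.
formats = {
    "1080i (30 frames/s)": 1920 * 1080 * 30,  # interlaced
    "720p (60 frames/s)": 1280 * 720 * 60,    # progressive
}
for name, pixels_per_second in formats.items():
    print(f"{name}: {pixels_per_second / 1e6:.1f} Mpixel/s")
# Either way, MPEG-2 must squeeze the result into the 19.39 Mbps ATSC payload.
```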
Hello fellow Wikipedians,
I have just modified 3 external links on Interlaced video. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FAQ for additional information. I made the following changes:
When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.
This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}}
(last update: 18 January 2022).
Cheers.—InternetArchiveBot (Report bug) 20:04, 14 November 2017 (UTC)
Theremin had nothing to do with interlacing. The source used in the article for the claim that Theremin invented interlacing can be found as a Google Book here: [1], and it makes no mention of Theremin having anything to do with interlacing at all, much less having invented it. --2003:71:4E16:4B83:4CBE:CCA1:12C7:4742 (talk) 04:40, 14 January 2018 (UTC)
The Description Section of this article includes the following text:
"Format identifiers like 576i 50 and 720p 50 specify the frame rate for progressive scan formats, but for interlaced formats they typically specify the field rate (which is twice the frame rate). This can lead to confusion, because industry-standard SMPTE timecode formats always deal with frame rate, not field rate. To avoid confusion, SMPTE and EBU always use frame rate to specify interlaced formats, e.g., 480i 60 is 480i/30, 576i 50 is 576i/25, and 1080i 50 is 1080i/25. This convention assumes that one complete frame in an interlaced signal consists of two fields in sequence."
This text contains several issues: 1. "576i 50 and 720p 50 specify the frame rate for progressive scan formats": 576i50 would be interlaced, not progressive; 2. "To avoid confusion, SMPTE and EBU always use frame rate to specify interlaced formats": I believe this is untrue for SMPTE, though true for the EBU. If you refer to the SMPTE UHD-SDI Standards Roadmap (see: https://www.smpte.org/sites/default/files/images/SMPTE%20wallchart%232.6_20_17-JULY%202017.pdf), you can see that the SMPTE nomenclature 1080i50 and 1080i60 actually refers to 25 and 30 frames per second respectively.
It is my understanding that a lot of confusion arises from the fact that the EBU uses frame rates, while SMPTE uses field rates, in interlaced designations, while both EBU and SMPTE use frame rates in progressive designations.
Scj242 (talk) 07:27, 22 October 2018 (UTC)
Edit: Further supporting references can be found in the various SMPTE standards documents, such as SMPTE 274M, SMPTE 296M (e.g. 274M, Table 1, System 6, "1920 × 1080/50/2:1, 25 [Frame rate] 2:1 interlace").
Scj242 (talk) 20:01, 22 October 2018 (UTC)
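The ambiguity described in the comments above can be made concrete with a small parser. This is a hypothetical helper, not any real library function; it simply shows that the same identifier "1080i50" yields 25 frames per second under the field-rate reading and 50 under a naive frame-rate reading:

```python
import re

def parse_format(identifier: str, rate_is_field_rate: bool = True):
    """Return (lines, scan, frames_per_second) for identifiers like '1080i50'.

    Illustrative only: with rate_is_field_rate=True, the trailing number of
    an interlaced format is read as a field rate and halved to get frames/s.
    """
    m = re.fullmatch(r"(\d+)([ip])(\d+)", identifier)
    if not m:
        raise ValueError(f"unrecognised identifier: {identifier}")
    lines, scan, rate = int(m.group(1)), m.group(2), int(m.group(3))
    if scan == "i" and rate_is_field_rate:
        rate /= 2  # two fields make one frame
    return lines, scan, rate

print(parse_format("1080i50"))   # field-rate reading -> (1080, 'i', 25.0)
print(parse_format("720p50"))    # progressive: number is frames/s
```

Under this sketch, the EBU-style designation 1080i/25 and the field-rate designation 1080i50 describe the same signal, which is exactly the source of confusion the comment identifies.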
Progressive is a relatively new term for non-interlaced, yet it is used here to the extent that the article says "progressive scan" was reintroduced in the 1970s. This is incorrect. Non-interlaced scanning was reintroduced; it wasn't called progressive scan until the first HD televisions. Unless I'm seriously mistaken, monitors were previously advertised as non-interlaced, not progressive. Because of this, I wondered for years when TVs would go non-interlaced. Thetrellan (talk) 17:59, 6 October 2019 (UTC)
The IBM 8514 should be mentioned as one of the few interlaced computer display standards. What do you think? --RokerHRO (talk) 08:32, 17 October 2020 (UTC)