
Electronic media

A screenshot of a web page. The computers used to store, transmit, and display the web page are electronic media; the web page itself is an electronic medium.
Graphical representations of electrical audio data. Electronic media use either analog (red) or digital (blue) signal processing.

Electronic media are media that use electronics or electromechanical means for the audience to access the content.[1] This is in contrast to static media (mainly print media), which today are most often created digitally but do not require electronics to be accessed by the end user in printed form. The primary electronic media sources familiar to the general public are video recordings, audio recordings, multimedia presentations, slide presentations, CD-ROMs, and online content. Most new media are in the form of digital media, but electronic media may use either an analog or a digital electronic format.

Although the term is usually associated with content recorded on a storage medium, recordings are not required for live broadcasting and online networking.

Any equipment used in the electronic communication process (e.g. television, radio, telephone, game console, handheld device) may also be considered electronic media.

History of development

Transmission

Wire and transmission lines emerged as communication tools with the development of the electric telegraph in the early 19th century. Samuel Morse began developing his telegraph in 1832, using wires to transmit electrical signals over long distances. In 1844, the first successful telegraph line was established in the United States, and in the 1850s telegraph cables were laid across the Atlantic, connecting North America and Europe.[2] As the telegraph became mainstream, the need to transmit images over wire emerged; the pantelegraph, patented by Giovanni Caselli in 1861, became the first commercially successful fax machine, allowing printed images to be transmitted over a wire.[3]

The telephone was another breakthrough in electronic communication, allowing people to communicate by voice rather than written messages. Alexander Graham Bell achieved the first successful telephone transmission in 1876, and by the 1890s telephone lines were being laid worldwide.[4] Since all these breakthroughs relied on transmission lines, a further improvement came from the English engineer Oliver Heaviside, who patented the coaxial cable in 1880.[5] The coaxial cable allowed for greater bandwidth and longer transmission distances.

Significant improvements in the mode of transmission have been made over the past seventy years with the introduction of fiber optics, wireless transmission, satellite transmission, Free Space Optics, and the internet. Fiber optics were first developed in the 1950s but became commercially viable in the 1970s. Wireless communication, meanwhile, transformed the mode of transmission by doing away with wires and carrying signals on electromagnetic waves. Guglielmo Marconi pioneered practical radio transmission in 1897, and by the 1920s radio had become a mainstream source of news, entertainment, and military communication.[6] Satellite communication allowed data to be transmitted over much longer distances than previously possible; the United States entered the satellite era in 1958 with the launch of Explorer 1.[7]

Free Space Optics (FSO), which uses lasers to transmit data through the air, was first developed in the 1960s, but it was only in the 1990s that the technology advanced enough to become commercially viable.[8] The internet emerged in the second half of the twentieth century. In the late 1960s and early 1970s, the first networking and file-transfer protocols were developed, making it possible to move files between computers. In 1989, Tim Berners-Lee created the World Wide Web, making it much easier to share information through hyperlinks. In 1996, the Real-Time Transport Protocol (RTP) was standardized, allowing live audio and video to be streamed over the internet. RTP was a breakthrough for online entertainment, allowing real-time events to be broadcast live to audiences worldwide.

Display and output

The history of display and output technology begins in the early 19th century with the development of the galvanometer, which was used to detect and measure small electrical currents. In 1844, the telegraph sounder was developed, which used an electromagnet to produce a clicking sound corresponding to the electrical signals transmitted over a telegraph line.[9] It was followed by the telephone receiver, which used a diaphragm to convert electrical signals into sound. In the late 1800s and early 1900s, the first forms of electric lighting, including incandescent and neon lamps, were developed and used in various applications, including lighting for displays and signs.

In 1910, the teleprinter was invented, allowing text messages to be transmitted over a wire. The cathode-ray tube (CRT), which grew out of William Crookes's experiments in the 1870s, became widely available by the 1920s and was used for early television and computer displays.[10] The radio and television tuner was also developed in the early 20th century, allowing people to receive and tune in to broadcast signals. Speakers and headphones were invented in the late 1800s and early 1900s and were used for listening to audio signals from radios, phonographs, and, later, electronic devices.

In the 1950s and 1960s, LEDs and LCDs were developed, allowing for more compact and efficient displays for applications such as lighting and television monitors.[11] In the 1970s, laser light shows were introduced, using lasers to produce dramatic visual effects for concerts and other events. The first computer monitors were developed in the 1950s, and the first commercial PC monitor was introduced in 1976. Large electronic displays were introduced in 1985, allowing large-scale displays to be built for stadiums, arenas, and other public spaces. The term HDTV was first proposed in 1936, but it was not until the 1990s that standards were established for producing and broadcasting high-definition television signals.[10] The head-mounted display (HMD) was introduced in 1968 and continues to be developed and improved to this day, enabling immersive virtual reality experiences and other applications.

Electric signal processing

The history of electrical signal processing is closely tied to the development of electronic communications technology, beginning in the mid-18th century with the invention of the capacitor, which allowed electrical charge to be captured and stored. In the 1830s, encoding methods such as Morse code were developed, allowing information to be transmitted over long distances as electrical signals.[2] Electronic modulation emerged between 1832 and 1927 and was a crucial development in the history of telecommunications.
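
As a rough illustration of how such an encoding maps text onto discrete electrical signals, the minimal Python sketch below translates a word into Morse symbols; the tiny lookup table and the encode helper are illustrative assumptions, not part of any historical apparatus.

```python
# Minimal sketch: mapping text onto Morse-style symbols.
# The table covers only a few letters for illustration; a real table
# would include the full alphabet, digits, and punctuation.
MORSE = {
    "E": ".", "T": "-", "A": ".-", "N": "-.",
    "S": "...", "O": "---", "H": "....",
}

def encode(text: str) -> str:
    """Encode text as Morse symbols, separating letters with spaces."""
    return " ".join(MORSE[ch] for ch in text.upper() if ch in MORSE)

print(encode("notes"))  # -. --- - . ...
```

On a telegraph line, each dot or dash would correspond to a short or long closure of the circuit.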

Electronic multiplexing, which allows multiple signals to be transmitted over a single channel, was first developed in 1853 using a technique called time-division multiplexing (TDM).[12] Digitizing, or converting analog signals into digital form, dates to 1903 and the early development of pulse-code modulation (PCM) for telephone communications.[13] Electronic encryption, which allowed information to be transmitted securely over electronic channels, was developed between 1935 and 1945 and played a crucial role in electronic communications during World War II. Network routing, or the ability to direct electronic signals to specific destinations, was first developed in 1969 with the creation of the ARPANET, a precursor to the modern internet.[14] Electronic programming, or the use of electronic signals to control and automate processes, has been developed since the 1940s and continues to be an important area of research and development in electrical signal processing.
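
To make the idea of digitizing concrete, the following minimal Python sketch samples a sine wave and quantizes each sample to an 8-bit value, which is the essence of pulse-code modulation; the parameters and the pcm_encode helper are illustrative assumptions rather than a description of any historical system.

```python
import math

def pcm_encode(duration_s=0.001, sample_rate=8000, bits=8, freq_hz=1000):
    """Toy pulse-code modulation: sample a sine wave and quantize each
    sample to an unsigned n-bit integer (uniform quantization)."""
    levels = 2 ** bits
    n_samples = int(duration_s * sample_rate)
    codes = []
    for n in range(n_samples):
        t = n / sample_rate
        amplitude = math.sin(2 * math.pi * freq_hz * t)   # analog value in [-1, 1]
        code = round((amplitude + 1) / 2 * (levels - 1))   # map to 0..levels-1
        codes.append(code)
    return codes

print(pcm_encode())  # e.g. [128, 218, 255, 218, 128, 37, 0, 37]
```

Time-division multiplexing would then interleave code words from several such digitized channels onto a single line.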

Electronic information storage

The history of electronic information storage dates back to the 18th century, with the invention of punched cards in 1725 and of paper tape in 1846. These early forms of storage were used to hold simple text and numerical data.[15] In the 19th century, the phonautograph (1857) and the phonograph cylinder (1877) allowed audio to be recorded and stored, and the development of photographic film later in the century allowed moving images to be recorded and stored.[15]

In 1941, the invention of random-access memory (RAM) allowed digital data to be stored and retrieved at high speed, and RAM remains in use today.[15] Barcodes were first invented in 1952 for use in grocery stores, and the Universal Product Code (UPC) was standardized in 1973, allowing product information to be stored and retrieved in a digital format.[16]
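
As a concrete example of how the UPC makes product data machine-checkable, the short Python sketch below computes the standard UPC-A check digit, which brings a weighted sum of the first eleven digits up to a multiple of ten; the sample number is only for illustration.

```python
def upc_check_digit(digits11: str) -> int:
    """Compute the UPC-A check digit for the first 11 digits.
    Odd-numbered positions (1st, 3rd, ...) are weighted by 3,
    even-numbered positions by 1; the check digit brings the
    total to a multiple of 10."""
    if len(digits11) != 11 or not digits11.isdigit():
        raise ValueError("expected exactly 11 digits")
    total = sum(int(d) * (3 if i % 2 == 0 else 1) for i, d in enumerate(digits11))
    return (10 - total % 10) % 10

# Example with a sample 11-digit prefix:
print(upc_check_digit("03600029145"))  # -> 2
```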

The technology behind the laser disc was invented in 1969, allowing high-quality video and audio data to be stored and played back; the format was first released commercially in 1978 but never achieved widespread adoption. Compact discs (CDs) were introduced in 1982 and quickly became a popular medium for storing and playing back digital audio data.[15] DVDs followed in the mid-1990s, offering higher storage capacity and the ability to store video data.

Content formats

Content formats are the different types of digital information that can be stored, transmitted, and consumed through electronic devices. The history of content formats dates back to the late 19th century, when the first audio recordings were created.

  • Audio Recording: In 1877, Thomas Edison invented the phonograph, the first machine that could record and play back audio.[17] The invention launched the era of audio recording and led to a succession of audio formats, including vinyl records, magnetic tape, and digital audio files. Disc records were introduced in the late 19th century and, later as vinyl, remained the primary format for music until the late 20th century, when digital audio formats such as MP3 and AAC were introduced. Vinyl, however, remains a cultural icon despite its obsolescence, retaining an aura of sanctity immune to symbolic pollution.[18] The MP3 was invented by Karlheinz Brandenburg of the Fraunhofer Institute.[19] The institute's encoder software enabled individuals to digitize their audio files using a compression algorithm called MPEG-1 Layer III; the software cost $250 at the time, and the compressed files could be stored on CDs.[19]
  • Video Recording: Video recording technology emerged in the early 1950s, when a team led by Charles Ginsburg at Ampex Corporation built the first practical videotape recorder. Videotape technology was later refined into different video formats, such as Betamax, VHS, and DVD. Betamax, an analog cassette format released by Sony in 1975, did not survive for long: the Video Home System (VHS), developed by the Victor Company of Japan in 1976, won the videotape format war and gained widespread usage worldwide.[20] The main difference between the two was that Betamax offered sharper, clearer images while VHS had a longer run time.[20] In the 21st century both formats have become obsolete, and digital video formats such as MPEG-4 and H.264 are now the dominant formats for video recording and playback.
  • Digital File Formats: The introduction of digital file formats marked a significant shift in how content was stored and transmitted. In the early days of digital computing, text-based formats such as ASCII and RTF were used to store and transmit textual content. ASCII was based on telegraphic code and had a very narrow scope, with only 128 code points.[21] RTF (Rich Text Format), developed by Microsoft in 1987, allowed documents to be shared across platforms and was especially valued because it could store formatting information such as font and style. Later, image formats such as JPEG and PNG were introduced, allowing digital images to be stored and transmitted, and digital audio and video formats further expanded the range of file formats available.
  • Database Content and Formats: Databases have been used to store and manage digital content since the 1960s. E. F. Codd conceptualized the relational database model in 1970.[citation needed] The model allowed applications to retrieve data by searching its content rather than by following stored links; it rested on predicate logic and set theory and set the stage for future databases (see the sketch after this list).[citation needed] The first commercially available relational database management system was introduced in 1979 by Relational Software, Inc. (later renamed Oracle Corporation). Since then, many different database systems have been introduced, including relational, object-oriented, and NoSQL databases.
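
The following minimal Python sketch, using the standard library's sqlite3 module and a made-up products table, illustrates what retrieving data by content rather than by following links looks like in a relational system; it is only an illustration of the model, not of any particular historical database.

```python
import sqlite3

# In a relational database, rows are retrieved by matching their content
# (declarative queries over sets of tuples), not by following stored links.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (upc TEXT PRIMARY KEY, name TEXT, price REAL)")
conn.executemany(
    "INSERT INTO products VALUES (?, ?, ?)",
    [("036000291452", "Chewing gum", 0.99), ("012345678905", "Notebook", 2.49)],
)

# Query by content: the application states a predicate over the table's rows
# and leaves it to the database engine to decide how to find matching rows.
for row in conn.execute("SELECT name, price FROM products WHERE price < ?", (1.50,)):
    print(row)  # ('Chewing gum', 0.99)
```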

Interactivity

Interactivity refers to the ability of electronic media to respond to user input, allowing for a more immersive and engaging experience. The history of interactivity can be traced back to the development of input devices such as the control panel.

  • Control Panel: Control panels were first introduced in the early days of computing as a way to interact with computer systems. These panels typically consisted of a series of switches and knobs that could be used to input data and commands into the computer.
  • Input Device: The development of input devices such as the keyboard and mouse marked a significant advancement in the field of interactivity. The introduction of graphical user interfaces (GUIs) allowed for more intuitive interaction with computer systems.
  • Game Controller: The game controller was introduced in the late 1970s with the Atari 2600 video game console. The Atari 2600 had no disk storage and only 128 bytes of RAM.[22] Its graphics clock ran at 12 MHz, and its ROM held only 4 kilobytes.[22] Despite such limitations, the console allowed users to interact with video games in a more immersive way, paving the way for more advanced game controllers.
  • Handheld: The introduction of handheld devices such as the Nintendo Game Boy and the Sony PlayStation Portable allowed for interactive gaming on the go. The Game Boy was released in Japan in 1989 and was criticized for lacking a backlight and color graphics. Sony released its first PlayStation console in Japan in 1994 and its handheld PlayStation Portable a decade later, followed by more sophisticated consoles such as the PlayStation 3, 4, and 5. The handheld devices featured built-in controllers and small screens, allowing users to play games anywhere and anytime.
  • Wired Glove: The wired glove was first introduced in the early 1980s as a way to interact with virtual reality environments. These gloves were equipped with sensors to detect hand movements, allowing users to manipulate virtual objects and navigate virtual environments. Wired gloves were a significant improvement over the mouse, joystick, or trackball for virtual interaction, but their high cost limited their spread.[23]
  • Brain-Computer Interface (BCI): The brain-computer interface (BCI) is the latest development in interactivity. The technology allows users to control electronic devices using their brainwaves, bypassing the need for physical input devices such as keyboards or controllers. While still in the experimental stage, BCI technology has the potential to revolutionize the way people interact with electronic media.

Electronic media breakdown

References

  1. ^ Medoff, Norman J.; Kaye, Barbara (2013-03-20). Electronic Media: Then, Now, and Later. Taylor & Francis. ISBN 978-1-136-03041-3.
  2. ^ a b Winston, Brian (2002). Media, Technology and Society: A History: From the Telegraph to the Internet. doi:10.4324/9780203024379. ISBN 9780203024379.
  3. ^ Evenson, A. Edward (2000). The telephone patent conspiracy of 1876 : the Elisha Gray-Alexander Bell controversy and its many players. Jefferson, North Carolina. ISBN 0-7864-0883-9. OCLC 45128997.{{cite book}}: CS1 maint: location missing publisher (link)
  4. ^ Grosvenor, Edwin S. (2016). Alexander Graham Bell. Morgan Wesson. Newbury. ISBN 978-1-61230-984-2. OCLC 1031985337.{{cite book}}: CS1 maint: location missing publisher (link)
  5. ^ Watson-Watt, Robert (1950). "Oliver Heaviside: 1850–1925". The Scientific Monthly. 71 (6): 353–358. Bibcode:1950SciMo..71..353W. ISSN 0096-3771. JSTOR 20196.
  6. ^ "Wireless Telegraphic Communication" (PDF).
  7. ^ Pratt. Satellite communications.
  8. ^ Malik, Aditi; Singh, Preeti (2015-11-17). "Free Space Optics: Current Applications and Future Challenges". International Journal of Optics. 2015: e945483. doi:10.1155/2015/945483. ISSN 1687-9384.
  9. ^ Stephens, C.E. (March 1989). "The impact of the telegraph on public time in the United States, 1844-93". IEEE Technology and Society Magazine. 8 (1): 4–10. doi:10.1109/44.17681. ISSN 1937-416X. S2CID 38619959.
  10. ^ a b Sterling, Christopher; Kittross, John Michael (2001). Stay Tuned: A History of American Broadcasting. doi:10.4324/9781410604064. ISBN 9781135685119.
  11. ^ Kawamoto, H. (April 2002). "The history of liquid-crystal displays". Proceedings of the IEEE. 90 (4): 460–500. doi:10.1109/JPROC.2002.1002521. ISSN 1558-2256.
  12. ^ Cranch, Geoffrey A.; Nash, Philip J. (2001-05-01). "Large-Scale Multiplexing of Interferometric Fiber-Optic Sensors Using TDM and DWDM". Journal of Lightwave Technology. 19 (5): 687. Bibcode:2001JLwT...19..687C. doi:10.1109/50.923482.
  13. ^ Black, H. S.; Edson, J. O. (January 1947). "Pulse code modulation". Transactions of the American Institute of Electrical Engineers. 66 (1): 895–899. doi:10.1109/T-AIEE.1947.5059525. ISSN 2330-9431. S2CID 51644190.
  14. ^ Hauben, Michael. "History of ARPANET - Behind the Net - The untold history of the ARPANET Or - The "Open" History of the ARPANET/Internet" (PDF). jbcoco.com.
  15. ^ a b c d Stockwell, Foster (2001). A history of information storage and retrieval. Jefferson, N.C.: McFarland. ISBN 0-7864-0840-5. OCLC 44693995.
  16. ^ Savir, D., & Laurer, G. J. (1975). The characteristics and decodability of the Universal Product Code symbol. IBM Systems Journal, 14(1), 16-34.
  17. ^ Edison, Thomas A. (1878). "The Phonograph and Its Future". The North American Review. 126 (262): 527–536. Bibcode:1878Natur..18..116.. doi:10.1038/018116g0. ISSN 0029-2397. JSTOR 25110210. S2CID 4125335.
  18. ^ Bartmanski, Dominik; Woodward, Ian (2018-03-04). "Vinyl record: a cultural icon". Consumption Markets & Culture. 21 (2): 171–177. doi:10.1080/10253866.2016.1212709. ISSN 1025-3866. S2CID 151336690.
  19. ^ a b Denegri-Knott, Janice; Tadajewski, Mark (2010-01-01). "The emergence of MP3 technology". Journal of Historical Research in Marketing. 2 (4): 397–425. doi:10.1108/17557501011092466. ISSN 1755-750X.
  20. ^ a b https://www.academia.edu/download/9905503/0262072904intro1.pdf [dead link]
  21. ^ Hieronymus, James L (1994). "ASCII Phonetic Symbols for the Worlds Languages" (PDF). Worldbet. S2CID 16144522. Archived (PDF) from the original on 2023-03-31. Retrieved 27 April 2023.
  22. ^ a b Wolf, Mark J. P. (2013). "Abstraction in the Video Game". In Wolf, Mark J.P.; Perron, Bernard (eds.). The Video Game Theory Reader. doi:10.4324/9780203700457. ISBN 9781135205195. Retrieved 2023-03-09.
  23. ^ Ortega-Carrillo, Hernando; Martínez-Mirón, Erika (2008-10-27). "Wired gloves for every one". Proceedings of the 2008 ACM symposium on Virtual reality software and technology. VRST '08. New York, NY, USA: Association for Computing Machinery. pp. 305–306. doi:10.1145/1450579.1450665. ISBN 978-1-59593-951-7. S2CID 17206527.
