Learn about the roots of modern computing with the BBC’s free archive
The Computer Literacy Project archive offers full programs and pre-made clips for those with less time on their hands.
From an early-’80s look at 20th-century automation to a ’90s perspective on the growth of computers in society, the BBC’s archive of several hundred classic documentaries is now free for anyone to access. With interviews featuring visionary pioneers of the time like Bill Gates and Steve Wozniak, it’s hoped that the retro programming can help inspire future innovators and teach people young and old about where some of their most-used modern technology came from.
The BBC has been a major force in driving digital innovation for decades, not only in its programming, but in encouraging home hardware hackers with the release of the BBC Micro in 1981 and, more recently, the ARM-powered micro:bit in 2016. The purpose of these efforts was to encourage computer literacy among school-age children and to foster an interest in programming and electronics. The release of the archives of the BBC’s “Computer Literacy Project” is another step on that road.
A total of 267 programs make up the archive and are freely available to stream online. Originally broadcast between 1980 and 1992, they run the gamut from interviews with industry leaders to insights into specific hardware like the BBC’s own Micro computer.
“This archive offers a fascinating and nostalgic glimpse into an important milestone in the history of computing,” said Matthew Postgate, the BBC’s chief technology and product officer, via BBC News. “The hardware may have changed, but the principles still apply — which also makes it a unique resource for teaching and learning that will hopefully encourage a new generation of computer users.”
To give modern viewers a better look at some of the software that ran on that particular platform, the new archive also allows visitors to run a number of BBC Micro programs within their browser. They are, understandably, basic by modern standards, but they include interactive graphical demonstrations, a breakdown of the chip manufacturing process of the time, and an outline of basic encryption algorithms. There are more than 160 in total to play around with, and they are a great example of just how far we have come in the past four decades of computing.