Here’s a terrific vintage Apple II ad – one that might inspire some readers of this site to go eBay an Apple II for their next VJ gig. (Hey, it has analog video out built in, something a modern Apple … doesn’t.)

Of course, what’s telling is that when this ad was made, in 1977, buying an Apple II was not about getting an ecosystem of pre-built software. Most of the examples assume you’re writing the code for what you’re seeing yourself. Sure enough, pick up a vintage computer manual – they seem ubiquitous at flea markets and garage sales – and it more or less jumps into coding right away, typically in BASIC. DIY code might sound intimidating, but the programming examples are very often simple enough that a novice or even a kid could get going right away. (To be fair, it’s sometimes unclear what the purpose of printing text to the screen in a loop would be; before we wax nostalgic, coding today really is a lot more fun and productive than it was three decades ago.)
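To give a sense of how low that bar was, here’s the kind of two-line program those manuals opened with (and the kind referenced in the comments below) – a sketch from memory in Applesoft BASIC, not a quote from any particular manual:

    5 REM FILL THE SCREEN WITH TEXT, FOREVER (CTRL-C BREAKS OUT)
    10 PRINT "HELLO"
    20 GOTO 10

Type RUN and the screen scrolls with text until you stop it – not exactly Processing, but enough to make the machine feel like yours.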

It’s not so much about the programming language used, or the fact that the Apple II, when introduced, just didn’t have that much software. What you see in the ad is a vision in which the whole family, kids included, can be expressive with code in a way they might not be with off-the-shelf software – whether in 1977 or 2012. That’s why it’s so nice to see things like Processing on the iPad. Forget the tablet-versus-PC, iPad-versus-desktop battles. Computers have always played a tug of war between consumption and creation, fun and productivity, entertainment and art. They’re at their best when those boundaries really do begin to melt away, into something like the expressive “bicycle for the mind” Jobs once described.

In a tech-phobic 1977, truthfully, computers were in the hands of a few, and this vision of programming your own graphics was likely alien to most people’s reality. In 2012, it’s another story. Computers, whether in a phone or on a desktop, are spreading to the whole planet’s population. And the idea of groovy computer graphics seems less like some weird technological thing (hence the strange vocoded voiceover) and more like a natural form of self-expression.

In other words, this needn’t inspire only nostalgia. It could be an idea that was simply before its time.

  • http://www.apl2bits.net/ Ken Gagne

    I like the symmetry: that in 1977, "this vision of programming your own graphics was likely alien to most people’s reality" compared with today, when so many users are walled off from programming their Macs that they wouldn’t know where to begin. It’s so similar to, yet so different from, where we started.

    • http://pkirn.com/ Peter Kirn

      That’s a matter of perspective, though, yes? You can drop down to a command line and do things the old-fashioned way. As for Xcode, “walled off” isn’t how I’d describe it so much as “facing a massive, complex system.” But that’s why mucking about with the CLI – or using a graphical environment that’s more direct, like Processing – has such appeal.

    • http://blogs.computerworld.com/gagne/ Ken Gagne

      @peterkirn:disqus, I agree that the Mac is still accessible and programmable to those who want it to be so. But the necessity is very different now: you can be a user without being a programmer.

    • http://pkirn.com/ Peter Kirn

      Absolutely! The irony, though, to me is that even though the Apple II needed you to be a programmer for many tasks, many people … just weren’t. And for those who were, a lot of what they did barely exceeded 10 PRINT “HELLO” / 20 GOTO 10. (Seriously.) I think it’s funny, actually, that we get morose about programming today. Now, programming *education* in schools I think has gotten substantially worse – no more LOGO in every classroom, replaced with travesties like PowerPoint. But programming *literacy* seems substantially better. And many of the people you talk to who do code are reaching really quite high levels of competence.

      If you could put those two things together – the LOGO of yore with the, frankly, more enjoyable programming experience of today (even on the CLI) – imagine what would be possible.

  • Vixmedia

    Mine is in my carport — bought in 1983 to run the Alpha-Syntauri keyboard system.

  • Bgg

    I wouldn’t say that programming today is necessarily more interesting or rewarding, particularly if you are interested in generative graphics. If you were, say, very serious about programming a C64, you would eventually map the entire 6510 processor in your mind. Writing assembler and getting messy with the hardware is far more interesting to certain types than writing with high-level graphics libraries.
    There is a reason the C64 demo scene is still where the most impressive work is done, and what is the point of writing PC demos that depend on 3D graphics co-processors and the like? Might as well just work in video/animation.

    • http://pkirn.com/ Peter Kirn

      Right, except you can still code that way today if you like. Back then, you didn’t have a choice. And… really, the work is, by extension, less impressive outside the C64 demo scene? I mean, I love that scene, but that’d be a pretty strong statement. Care to defend that idea?

  • Bgg

    Oh yeah, everything is more accessible, and true, one can still go the old way for kicks. And not only that, but there are so many microprocessors, FPGAs, and so on that one can use for personal projects.

    What’s impressive when it comes to generative graphics programming? 1) a good-looking final production, 2) hardware hacking / programming chops. PC demos / high-level GPU coding can satisfy 1, but not so much 2. C++ SDKs may be fast and powerful, but they block out the freakiness. The modern PC/Mac solution is to keep the programmer in a box and make code ‘safe’, but that’s hardly interesting for satisfying part 2.

    The ‘build a faster processor and use safer code’ method of improving performance is just not terribly interesting in comparison. And if everyone is not producing on the same-spec equipment, there is no way to judge 2). So what is the point of doing fully generative computer graphics on a PC in a ‘demo’ sense? Sure, you can do a lot more with your 40k, but it’s only because you’ve offloaded more onto the GPU and common libraries. If you’re going to take a technological ‘arms race’ approach, one might as well drop the pretense of showing off coding chops and wizardry and just use animation / video editing software. It’ll take you to 1) faster.