
Going to Cupertino for Vision Pro Developer Labs


Go to the source

There are a bunch of roadblocks when trying to develop spatial computing software without a spatial computer. How quickly does the Vision Pro recognize surrounding surfaces and make the reconstructed scene available for use? From how far away can it recognize surfaces? What kinds of experiences are best suited to the Vision Pro given its weight, heat generation, and materials? Does my ARKit code even work?
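To sketch what I mean by "my ARKit code": on visionOS, scene reconstruction comes through ARKit's ARKitSession and SceneReconstructionProvider, with mesh anchors streaming in as the device maps nearby surfaces. Roughly, the shape of that code looks like the snippet below (based on my reading of the visionOS beta APIs, and assuming the app already has world-sensing authorization – details may shift as the beta evolves). The catch is that scene reconstruction isn't supported in the simulator, which is exactly why I need time with the hardware.

```swift
import ARKit

// Sketch: stream scene-reconstruction mesh anchors on visionOS.
// Only works on device – SceneReconstructionProvider isn't supported in the simulator.
func observeSurroundings() async {
    guard SceneReconstructionProvider.isSupported else {
        print("Scene reconstruction unavailable (e.g. in the simulator)")
        return
    }

    let session = ARKitSession()
    let provider = SceneReconstructionProvider()

    do {
        try await session.run([provider])
    } catch {
        print("Couldn't start ARKit session: \(error)")
        return
    }

    // Mesh anchors arrive as the device reconstructs nearby surfaces;
    // how quickly, and out to what distance, is what I want to measure.
    for await update in provider.anchorUpdates {
        switch update.event {
        case .added, .updated:
            print("Surface mesh \(update.anchor.id): \(update.anchor.geometry.faces.count) faces")
        case .removed:
            print("Surface mesh removed: \(update.anchor.id)")
        }
    }
}
```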

I don't know.

So I'm going to Cupertino tomorrow for the Vision Pro Developer Labs. It's a bit of a trek from Chicago – but it should be fun.

I won't be able to share many details about the labs or the device, but I'm eager to get my grubby hands on the hardware and try out my early visionOS version of Float.

In the summer of 1996 – when the N64 had already been released in Japan but hadn't made it to the US just yet – there was a semi-shady strip mall video game store in the northwest suburbs of Chicago with an imported, Japanese N64. The shop owner put the N64 in a booth in the back of the store and enclosed the booth with a black curtain. For $5 you could play (Japanese) Mario 64 for fifteen minutes or so, starting each session from a brand new file. It was wonderful.

This feels a bit like that.