Introducing Z Vector (posted July 9, 2013)


About 1.5 years ago, around the same time we launched the public beta of NI mate (our first software), I ran into two musicians who had a band and were headed to the same place I was – South by Southwest in Austin, Texas. I heard one of their songs that I particularly liked and came up with an idea – we could make a depth-sensor-based music video for it.

From my Vimeo account: “We filmed Scars in about four hours at our company recording studio in Helsinki, Finland on a snowy evening in February 2012. The band had no clue what to expect: Hanna had a high fever and zero make-up on, while Tommi came in jeans and a hoodie. All we had was a single sensor, a laptop, a totally experimental piece of real-time graphics software, some carnival masks I’d brought from Venice many years earlier, and a vague idea of what we’d set out to do. Trying all kinds of crazy moves in front of the screen and the Kinect, we recorded a total of about 50 minutes of screen-captured video synched to the music, which was then delivered to Hannu for editing and post FX. Just a few evenings and iterations later (with poor Hannu running 11 parallel tracks of footage) we had Scars.”

In total we spent just 60 hours on the project, but Janne had come in with software that would eventually become Z Vector. The first version had a very basic but very usable sound-animated user interface that worked in real time. The visuals, especially for something put together on a whim, looked astonishingly cool. The video eventually found its audience, and we ran with the idea. Now, after 1.5 years of hard work – and literally hundreds of hours of field testing – we’re proud to bring you Z Vector. We also thank you for taking part in our beta. We sincerely hope you have as many inspirational and fun moments with this software as we have had this past year.

PS. Never try to write a blog post at the end of a seven-day crunch to get a piece of software out. :wink: