All posts by Kenny Bunch

Understanding the evolution of the Apple TV.

It has been a few weeks since the marketing machine we all know as Apple announced their plans for the TV. Given that space of time, I think that people can now look at the announcement a little bit more objectively and try to make true judgement calls on it. Everyone is asking, what can we expect from this product?

I think to truly have insight into any product announcement, you have to ask a few other questions first:

  • Is this what they set out to build?
  • What are their future plans?

If you answer these questions, you can begin to answer the larger question of what to expect.

As someone who builds his business around platforms like Apple’s, my mind has been guessing and evaluating the possibilities for quite some time. With the new announcement, we now have a tangible picture of what they’ve done for the Apple TV and can draw some conclusions about where it’s going. Since I have pondered the what ifs a lot longer than most, I’m going to give you a little dive into how my brain has deconstructed it all and you can take from that what you will. A forewarning, I think about this a lot. I have a lot to say, but I think it will help you draw valuable insight into Apple’s new product and how they operate.

So let’s jump into our first question: “Is this what Apple set out to build?” It is important to note that products typically don’t pop up overnight. They are planned well in advance and evolve over time. Inside discoveries and outside market forces tend to shape the direction. I like to think that Apple has product visions that stretch over multiple years rather than operating on quarterly reactions. Other forces affect it, but I believe that, more than other companies, they have very strategic efforts that they carry out. Let’s take a trip back in Apple’s timeline to see what I mean.

Here are some key dates that we can use to potentially understand Apple’s TV initiatives:

  • January 22, 2005: the Mac mini is introduced
  • February 14, 2005: YouTube goes live
  • September 7, 2005: iTunes arrives on a Motorola phone
  • October 2005: Front Row and TV show purchases on iTunes
  • September 2006: movie purchases on iTunes
  • January 9, 2007: the original Apple TV
  • June 29, 2007: the iPhone
  • January 15, 2008: iTunes movie rentals
  • July 2008: the App Store
  • April 3, 2010: the iPad
  • September 1, 2010: AirPlay
  • July 24, 2013: Google’s Chromecast
  • May 28, 2014: Apple acquires Beats
  • November 19, 2014: Amazon’s Fire TV Stick
  • September 9, 2015: the new Apple TV

Each of the dates above play a role in my mind to the evolution of the product. Some were initial inspirations and others were planned initiatives around the TV itself. If you look at what happened on each of the dates and focus on how they relate to video, the timeline starts to tell a story.


The iPod and iTunes

There have been many influences over the years, but what can we point back to as the original spark? I’m going to pick the iPod and iTunes. It is common knowledge that iTunes and the iPod changed Apple and the entire music industry forever. Apple hit a sector of the entertainment industry that was falling apart and desperately needed a solution. Apple offered it, and in turn became the main broker of music entertainment. This introduced a completely new outlet to their business: selling content. It obviously opened their eyes and filled their pocketbooks. I don’t think many people saw it coming, not even them. It was the beginning of a shift in their thinking.


Media Centers and the Mac Mini

Over the course of the next few years, Apple really prospered. Obviously anyone at the helm of a ship like that starts asking what other things the winning formula can be applied to. It seems like a no-brainer: TV shows and movies, right? They are consumable media just like music. At the time of iTunes, video content wasn’t very digital yet, but people were exploring its potential. In 2002 Microsoft introduced a Media Center Edition of XP that was a play into the space. Their OS sought to provide access to movies, TV, pictures, music and more in a “leanback” experience. This really got the conversation going about a computer for the living room that brought all media in digitally.

Fast-forward to January 22, 2005, and Apple introduces the Mac mini, an extremely compact version of their desktop computer. At the time, the market was saturated with monster computer towers that took up enormous amounts of space and had 50 different components thrown into them. This was a great move by Apple on many levels. It started to remove a lot of excess in desktops, reducing both cost and size. People had become accustomed to the desktop beasts they were forced to live with, so it didn’t innovate that market overnight. It was good for the market, but it wasn’t a game changer. However, if you look at it from a different perspective, you can see a bit more genius in it. As people began exploring “media centers” in their living rooms, the giant towers weren’t accepted there. Consumers weren’t going to be OK with putting some massive brick beside their TV. The Mac mini was the perfect form factor for that room. If you compare it to the modern Apple TV, the two look almost exactly the same. I’m sure a lot of the components, tooling, and manufacturing capabilities used to create the mini were leveraged to produce the TV.

At this time, I believe Apple had a vision of getting into the living room, but they weren’t sure what it was going to look like yet. They knew video content still hadn’t made a huge shift to digital, but it was gaining more traction than ever before. Flash video had been introduced in 2003 and was updated in 2005 to be far more viable. People were starting to explore the potential of this technology, both large media companies and startups. The biggest player in the movement, YouTube, went live on February 14 of that same year. This time period provided the biggest spark for the digital video movement, making it more viable than ever.


Front Row and Purchasable video content

Apple was reading all the signs, and the mini provided them a vessel to explore with early adopters. They were playing it safe, because they still needed to determine consumer behaviors around the space before they jumped into it. It was still evolving. Unlike Microsoft and Google, who are quick to get their ideas out, Apple tries not to make a play at something until they feel the play is solid. They do explore, but are calculated in their explorations. Jumping to October of that same year, you can see this. Apple announced new software, Front Row, and introduced TV show purchases to iTunes. For those not aware of what Front Row was, it could be seen as Apple’s first “media center”. When it was announced, their computers started shipping with small remotes. When used with the Front Row software, it turned your desktop computer into a leanback experience. You were able to browse photos, listen to music, and watch videos. The introduction of TV show purchases to iTunes, along with movies the next year in September 2006, marked their desire to bring the model they had applied to music to video. Even if it was an exploration play, it showed Apple had interest in the living room.


Apple TV (v1)

The original Front Row, remote, and iTunes purchases probably got very small usage, but I believe this taught Apple a lot. It illustrated that the living room was a different experience. Consumers weren’t going to just move their desktops into that room. They weren’t going to spend a fortune for that experience. They didn’t need a powerful beast to just consume things. They just needed the simple things that Front Row provided on a less expensive device. It could be very focused and lean. Fine-tuning their software and hardware, on January 9, 2007, Apple brought out their answer, the original Apple TV. With its launch and for years to come, they also made a very strategic note that it was a pet project for them, an experiment. They knew the market wasn’t completely there yet and they didn’t want the product to be viewed as a failure. This type of marketing definitely buffered them from criticism over the years as their other products had more stellar success stories. I also truly believe that they understood how complex the TV and movie business was. Unlike the music industry, the TV and movie business wasn’t in dire distress. The relationships and how it was run made for a huge hurdle as well. It has many layers, which I’ve mentioned before. Apple wasn’t going to be able to take that market as easily as they had with music. Knowing it was eventually going to happen, their approach was genius.


The iPhone distraction

The next 8 years, I consider years full of learning and distraction. The distraction came from Apple’s second modern blockbuster, the iPhone, released on June 29, 2007. In a world full of people looking at their smartphones every 5 minutes, it is hard to imagine a world that existed without them. Being in the industry, I remember the promise for years of how insane the market could be. The number of people who owned phones vs computers, the rate at which they acquired new devices, everything showed promise. It wasn’t until the iPhone that anything actually delivered on it.

Personally, my guess is Jobs was more interested in the space of personal computing than small handhelds. He was a creator. I think what we saw the iPhone become, he originally had planned for the laptop/tablet. In a way, the phone was a distraction for him from building a device to create with; it was a device to communicate and consume. I tend to wonder if it went against his grain a bit, given I feel he desired one-on-one communication.

If you back up to September 7, 2005, 2 years prior to the iPhone launch, Apple announced an attempt with Motorola to put iTunes on one of their phones. It made total sense, given the closeness in form factor and the ability to have the iPod morph into a connected device. I see the Motorola venture as another experiment. Apple was looking for what that experience might be. They knew the potential for smartphones, but had to figure out how to approach the industry first. I think the experiment showed them how underserved the market was. It probably also ate at Jobs that it was so bad.


The App Store

A year passed after the launch of the phone, and in July 2008, Apple announced the App Store on the phone. This was a monumental game changer for computers in many ways. Up to this point, software was either shrink-wrapped or purchased digitally direct from the creator. The phone posed interesting issues of “how do you get apps on it?” and “how do you ensure they don’t wreak havoc on the device?” Since iTunes and music were the original motivating factors to move to the phone, Apple took a play from its own book. They created the App Store to be iTunes for apps. Not only were they able to sandbox the things being deployed to the phone, they became the curators and brokers of what went on the device. It wasn’t a new concept, but it was the first time someone was able to pull it off. With the ability to have an instant market for what you create, this brought developers in swarms to the platform. It transformed the device into something entirely new. It became a brain and game console in your pocket.

Even more so than the iPod, I don’t know that Apple saw how big this would become. Their company focus shifted again. The phone was their golden goose and would hold the company’s main focus for years. Everything that hadn’t been fully developed got pushed down in priority.



The iPad

The one thing that didn’t lose priority was the device that I believe Jobs originally wanted to create, the iPad. On April 3, 2010, the first version was released. In watching Jobs present it, I feel it was very evident that this was one device he had envisioned for a long time. In many ways, the iPad was just a big iPhone. Some people mocked it as such and couldn’t see the need for it. What they weren’t seeing was that it lent itself to consuming content that wasn’t as suitable for the phone, from both a connection and form factor perspective. One of those was video. If you are going to be in a place that has wifi, sitting instead of standing, would you rather watch a movie on an iPhone or an iPad? The larger form factor also gives you more room for control. It filled a gap that the phone and laptops didn’t.


Purchase vs Rental

As I mentioned, while the iPhone revolution was going on, Apple was still learning. The TV took a backseat, but I think it needed to. The market was still too complicated. Apple originally approached it with a purchasing model, applying the same logic to video content as they had to music. The problem with that approach was people don’t consume TV shows and movies in the same way they do music. Typically unless you have kids and you just throw a movie in and hit repeat, you aren’t going to watch the same thing over and over again. With music, you do. Coming from a behavior of purchasing albums and having the desire to listen to the content over and over again, people were willing to buy. They wanted the ability to own their content and take it with them. The iPhone and apps like Spotify and Pandora have since changed this behavior, but coming off of CD sales and Napster, it was what people expected.

If you look at the video market, it was dominated by rentals, VHS and later DVDs. DVD sales existed, but they weren’t the dominant choice. Apple later realized this and, a few years after introducing purchasing, announced rentals on January 15, 2008. The market for the TV was still very niche though.


AirPlay and one device to rule them all

With the massive success of the iPhone and iPad, I bet Apple’s thinking shifted. Apple started asking the question: what if the iPhone is the center of everything? If I have access to all my content thru that device and I carry it with me everywhere, why not make it the brain? On September 1, 2010, Apple started with another experiment: AirPlay. AirPlay is the ability to mirror or broadcast content from one device (iPhone, iPad, or Mac computer) to another device like the Apple TV. This is a powerful concept: you really only need one smart device, and anything it broadcasts to merely needs to be a receiver. Bill Gates outlined it in his book “The Road Ahead”, written in 1995. I explain this point because many people don’t know what AirPlay is or that it even exists. That is part of its issue; it requires multiple unified elements and an understanding of how they all tie together.

I’ve longed for this concept to become true. In presentations I gave back in 2005, I talked about the iPod becoming just that. Even back then, it was evident that it could become a handheld super computer. Google made a go at this too with Chromecast, a simple TV receiver/dongle priced under 40 dollars, introduced on July 24, 2013. More recently there have been attempts by Android and Microsoft to take the extra step and make the device change its UI based on the context of the receiver. Microsoft labels it Continuum. It will come, but the complexity of that market will take time to evolve.


The complexity of the TV market

AirPlay illustrated Apple was trying to look at different approaches to the TV market. In addition, I think they were trying to figure out a much harder issue, how to enter it from a content perspective. Apple tried both purchase and rental models with content. Traditional media companies were fine with this approach since it was more of a secondary market. What I think Apple came to realize is that consumption behavior for video is all about first run and being the original distributor. This made their job much harder.

Like I mentioned in the Apple TV (v1) section, there were huge barriers if you were going after the premium content tier (movies, TV shows). We know from 2014 FCC filings by Comcast and Time Warner that Apple had approached them about jointly developing a set-top box. The relationships between content providers and MSOs (multiple-system operators like Comcast, DirecTV, etc.) are so hard to break, I believe Apple realized that a strong play would be to work directly with the MSOs. The problem with that is that the MSOs knew they still had a good thing and saw what happened to the music industry. I think they may have strung Apple along. Why let them in if you are doing well and your wall is high?


TV starts to break down

If you read my article “What is happening to the entertainment industry?”, you’ll see that the walls that were once so high have broken down. It didn’t happen overnight like the music industry, but the entertainment industry is slowly entering an initial state of distress. One of the largest catalysts to this movement is Netflix. They originally copied the cable companies’ subscription model and applied it to DVDs, then transitioned that success to online distribution in 2007. One of the keys was convenience and paying a small monthly fee. Consumers were used to subscriptions; it was expected and welcomed. The cost played a huge role too. In comparison to DVD rentals or cable subscriptions it was more approachable. Netflix could arguably be labeled the first online cable company.


What’s the cheapest way to get Netflix on TV?

As Netflix became more popular online, people began to have the desire to watch it on their TV. This fueled what I like to call the “how can I get Netflix on my TV” phase. Netflix was incredibly smart about trying to get it on anything and everything that could land them on the TV and in your living room. Whether it was a smart TV, an Xbox, an Apple TV, or some random device like a Boxee, the public started to seek out the cheapest way to get Netflix on their TV.

In the spring of 2014, both Google and Amazon introduced versions of their OS for the TV. However, I think the real tide turned on November 19, 2014, when Amazon introduced a small dongle for the TV known as the Fire TV Stick that ran their TV-based OS. The kicker was they originally sold it for only $19 to Prime members. From a cost perspective, it made this a no-brainer. It was so cheap, why would you not buy one? What was surprising was that the device and OS were actually nice. For most people, it entered as the cheap way to get Netflix (and Prime) on the TV, but the man behind the curtain knew it had an app store and was running Android. If you jump forward to today, a lot of content companies like HBO, Showtime and others have applications running on the device.


Is the new Apple TV what Apple wanted?

Finally, let’s talk about the new Apple TV (v2) that was announced on September 9, 2015. I think we can look at all of the points mentioned in this article and see how this product evolved. We know Apple has had an interest in the TV space all along; they knew the promise of it. I think they have been cautious about how they approach it, both from a device and a content perspective. They experimented with different approaches, but I think very early on they realized they needed a relationship with content providers in the same way that MSOs had. I believe they also realized that because the industry was relatively stable, it was going to be hard to accomplish that. Their attempts to get an insider advantage by working directly with MSOs failed. However, the industry has also taken a significant turn. Netflix and Amazon have started to illustrate there are approaches that work.

Another key point that we haven’t discussed is games. Games have been a staple of the living room since the days of the Atari 2600. With the iPhone, Apple owns the handheld gaming market. They have instant distribution. Why would they not take on consoles like the Wii? I suspect this was one of the other pushing points for the Apple TV.

I feel Apple really wants to be a distributor like the cable company. From a revenue perspective, it is a market they aren’t capturing like they could. I think they’ve always wanted to be there. How they approached it has changed and evolved, but their underlying goal has been consistent.

The TV market shifting how it has, other devices seeing success, Apple’s desire to compete with consoles, and, most importantly, consumer awareness and the desire to bring interactivity to the TV experience all laid out a perfect time for them to make a play.

The product isn’t exactly what they wanted, but it holds to the essence of what they’ve been striving for. Time will tell, but I believe it is missing a big key component they desire: a subscription/distribution outlet.


What are Apple’s plans?

So, if we consider that Apple’s original plans are incomplete, do we think they are still going after them? Absolutely.

Apple does hardware and OS-level software extremely well, but they tend to trail a bit behind companies like Google or Amazon, who excel at cloud-based software. Netflix’s success has come from their cloud-based subscription/distribution platform for video content. To compete and do well in this market, you have to have that. On May 28, 2014, Apple acquired Beats, which did just that. For those who only see things at a surface level, one might assume the acquisition was done in order to get their line of physical headphone products. I think that was just an added bonus. In buying Beats, they got multiple elements that could play critical roles in their business. The most important one was the subscription service. We have already seen this take form with Apple Music. Their once dominant position in the music industry has faded with Pandora and Spotify. The service helps them address that AND it helps them with a future setup for TV. They will have an app just like Netflix or their own Music app that is a subscription service. My guess is they are still trying to work thru content relationships and they want it to be done right. In addition, having the device in the market will help drive discussions around those relationships.


Talk to me like “Her”

You may be asking, what about the software on the device itself? Is that what Apple intended? For years I’ve felt like one of the hardest pieces to tackle with the TV is the interaction model. Directional navigation controls, ugh! Apple stayed the course there with the small addition of a swipe surface for a little more control. Why do you think that is? Apple showed you why: it is voice. They understand that living room interaction is limited (outside of games). Really you just want to find and consume something. The goal isn’t interaction as much as it is getting to the content and consuming it with as little effort as possible.

Siri has become more and more powerful over the years. It is nowhere near the level that we see in the movie Her, but that reality is coming. Amazon, Google, Microsoft and Apple all understand that, for consuming information and media, the best interface is a non-tangible one. Think about how much easier and more efficient a natural language conversation can be. What’s the weather today? What was the score for last night’s Braves game? Can you find that movie with Robert Redford about baseball? This is Apple’s true play for the future of consumption-based interaction with computers. The TV environment lends itself better than any other to a voice interaction model.

The problem with basing interaction on voice is exposing content in applications that by their very nature hide it. How do you say “I want to watch Orange is the New Black” when the only place you can consume it is the Netflix app? Unlike the web, which was indexed by Google and liberated with its search, mobile apps have been silos. Google and Apple have both approached this problem recently with their App Indexing and Universal Links solutions. We finally have indexing of applications. Just as search transformed the web, this level of content awareness will take OSs to an entirely new level.
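To give a concrete feel for what Universal Links look like in practice: a site declares which of its URLs an app can handle by hosting a small JSON association file that the OS fetches and checks. A minimal sketch of that file might look like this (the team ID, bundle ID, and paths here are hypothetical, purely for illustration):

```json
{
  "applinks": {
    "apps": [],
    "details": [
      {
        "appID": "ABCDE12345.com.example.videoapp",
        "paths": [ "/shows/*", "/movies/*" ]
      }
    ]
  }
}
```

The file is served from the site’s root (e.g. at `/apple-app-site-association`), and when a user follows a link matching one of the declared paths, the OS can hand it straight to the app instead of the browser. That link-to-app mapping is what starts to make content inside app silos addressable from search and voice.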


How important are applications?

If you paid attention to the keynote and how content was structured within the OS, something else stuck out. Not only was Apple providing an easy way to jump into content from search, they curated the results in a similar fashion to how Google does on the web. To see what I mean, do a search for Big Bang Theory. Rather than simply list out various apps that might provide the show, they created a buffer page with show details too. If you are using voice as your interaction model and your goal is just to consume content, applications start to become less important to an extent. It should be very interesting to see what impact this has.


The Apple Gaming console

My particular interest in the TV obviously is how it relates to media. As I mentioned, the other driver for Apple is games. Games are the highest revenue stream they have in the App Store. Games and the living room have been like milk and cookies, a perfect match. It boggles my mind that Amazon and Google haven’t put more emphasis on growing this on their TV-based platforms. I know it is in their sights, but it seems they haven’t put a large focus on it yet. Obviously the big draw for them has been the cheapest way to get Netflix on the TV, but man, there’s a huge opportunity there. Apple consumers will expect it there and developers will see the potential. It is going to happen. If you jump back to Apple’s WWDC conference in the spring, you will see they have big plans. As they were showing off new games being developed in Metal (their graphics programming answer for taking game graphics and processing to the next level), it was obvious the games being shown were on a level that didn’t make sense for the casual expectations of handhelds. Apple is looking to take on the Xbox and PlayStation.

Should we be excited?

Apple has always been great about entering a market at the right time and using their marketing and approach to transform it. When Apple makes a play, the general public accepts it more than any other company’s attempts, because they perceive it as groundbreaking and are made more aware of it thru stellar marketing. They aren’t doing that much more than the others out there, but the public’s attention to it and developers’ willingness to get behind it will open this market up.

I have a feeling that it probably scares the MSOs. I would not doubt that, as content companies put their TV Everywhere apps on the platform, you’ll see some choice providers missing from the list. Once the general public starts to understand that they can have the same on-demand experience on their TV as they have had with their other devices, it changes everything. It exists now, but for the mass public it’s not there. That perception shift will make people question cable boxes. Change is inevitable though.

I personally am excited. I’m not an Apple Fanboy, but a fan of progression and the ability to create great things. Apple tends to open up doors and that makes me happy. I’ve been waiting for a long time for the TV to change. In the background, it already has. It has died little by little, but there is hope.

Just like Apple, at Dreamsocket we’ve experimented thru the years. In November 2006, we worked with PlayStation prior to the PS3 launch to dream of what a TV and game experience might look like merged. When asked why the project was important to me, I used the opportunity to throw my ideas at them. I noted that, as a developer, I wanted Sony to create a rapid development model that could allow anyone to publish to their device. I also mentioned that there was no need for discs; digital distribution over the internet direct to the device made a lot more sense. I just wanted the opportunity to be in the living room. I’ve always wanted it. There is something about having your app on a big screen.

Over the course of the next 9 years, we took other projects here and there to test the waters. When the Google TV came out, we built apps for it. Getting an opportunity to help create the actual interface for a gaming device itself, we jumped on it. Exploring how a phone could interact with TV, we made a play at it.

TV has been waiting to be modernized for years. Knowing that the new Apple TV was coming out, we developed new applications for some of our media clients on the Android and Amazon Fire TV devices. They were actually strategically targeted to come out the week after the Apple announcement and a few days before the Amazon one. They haven’t released yet, but they are coming down the pipe.

Why did we do this? Knowing that Apple has gone from experimentation to going after the TV market, I knew that, unlike anyone else, they have the capability to be a catalyst for the space. I’ve hated to see TV living in legacy for years. Even though the living room is less dominant now, it still is the first and final frontier for entertainment. Most importantly, developing for the TV is fun.

If you managed to read all of this, congrats and thank you ;). I could go into much deeper levels on many different points, and I missed a lot of things that I also feel are relevant, but it gives you a peek into my brain. That peek may be right or wrong on things, but it shows the importance of trying to understand the big picture. If you can do that, you can see how things emerge and you can spot opportunity. I’m excited about the Apple TV.

The Rise of the Superhuman

To preface my ramblings to follow, I’m in a love/hate relationship with modern technology. With every year, we become more empowered, yet we become more isolated and socially inept. I have faith though. I believe that at the core of every human exists a desire to do good and an ability to achieve greatness. Therefore, due to my faith in humanity, I have faith in technology. If advanced and used correctly, technology can make us better; it can bring us closer. There is a dark side, but we must look to lead ourselves into the light.

So where are we?
We are in the age of evolution. We are rising as humans, we are augmented. Our intelligence is no longer limited to our internal minds or our direct surroundings. We are able to bring in knowledge from social collectives, from endless repositories. This knowledge travels with us, guides us. We don’t think about it, but it exists today. How? We walk around with small gateways into super computers. We walk around with the “smart phone”. These little devices have transformed our capabilities, extended what we can do, and in short made us super human.

What do you mean super human?
If you are thinking that it’s just a phone, you are sorely mistaken. Whenever I talk to people about transformation, I always cite my move to San Francisco in the early 90s. Bear with me as I take you on a little story down memory lane. As a teenager, I lived in a very different era: the PRE INTERNET era. There was this mystique about California growing up. It came thru magazines, movies, and word of mouth. These mediums were my gateway into the world of skateboarding. I longed for the day that I could explore the world that I could only imagine. My mediums were very limited. My gateway was a mere peephole. After months of saving up, my only insight into a city that I had never been to was a few archaic books from a local library and a very small map to get me across the United States. YES, that is all I had to go on. I drove night and day across this great land of ours, from the small rural town of Kings Mountain, North Carolina to San Francisco, California. I didn’t know how to get around the city, where to stay, what part of town was good or bad, where to eat; I was basically clueless. I just knew one thing: I wanted to be there, I wanted to experience it. When I arrived I fell into a complete state of shock. I really didn’t know what I was going to do. I had to figure it out on the fly. Luckily for me, when put in a state of survival I can quickly figure things out (which I must say is in stark contrast to my usual dwelling on trying to make the right decisions).

So now jump forward to the present day. A few years ago I revisited my beloved city for business. This time I had my super human capabilities in tow. Before I even got there, I knew how the city had changed. I knew that certain areas I wouldn’t dare step foot in before, now boasted some of the best restaurants. I knew exactly where to stay in relation to where I needed to be. I knew who else would be there while I was there, where they were staying, and was able to arrange places to meet that none of us had ever been to. The kicker is how it augmented my intelligence in real time. My little brain in my pocket gave me voice step by step instructions of how to get where I needed to be. It told me the status and location of my friends. I knew everything that I wanted to know.

The internet has expanded our knowledge. We can research any subject we want to the nth degree. We can have conversations with and meet people from any place on the planet. We can sit on the floor in a small room in central Kazakhstan and work on core technology components for the largest media companies in the world (YES, I've done that). It is really amazing how the world has changed in the past decade. We are more informed and we are more connected.

So where are we going?
We’ve augmented our intelligence. We’ve put no limit on our communication and social boundaries. Our non-physical nature has been changed forever. The next step… our physical being. I’m not talking about putting computers within us. I’m talking about using computers to watch us, to inform us, to make us better physically. Ever heard of the Fitbit? A heart rate monitor? These are very simple augmentations. We can do better, much better, and I think we are about to see just how much better. Wearable technology has become the in thing to go after recently. No one has hit the nail on the head. Everyone is pretty much fumbling the ball. However, everyone knows what it can be. It reminds me of pre-smartphone talk. During those years leading up to the iPhone, there was always talk of how mobile was going to change everything. Yet nothing delivered. It took someone thinking about it from the user’s perspective, molding it to that concept, and doing it right. That changed everything. We are about to see that happen again.

What in the world am I talking about?
I’ve been doing a lot of research over the past year. On top of just looking things up, I’ve tried out several apps, bought various accessories, and really tried to see what was happening in the industry as a whole. I believe it is still somewhat in its infancy, but biometric engineering is probably going to be one of the fastest growing industries in modern times. All the elements you need to really make something revolutionary exist today. They are fragmented or partial, but they exist. Biometric and motion sensors are probably the most important.

Imagine for a moment, if you will, that you could analyze your blood in real time. What if you could check glucose levels, determine oxygen levels, and more, all without being physically invasive? If you could analyze all these biometric readings, you could probably in turn create a biometric signature tied specifically to a particular person. You could digitally ID them. In addition, more finite motion sensors attached to you could determine the difference between you bending over to touch your toes and you bending over to deadlift a few hundred pounds. Now take those readings and tie them into an extremely intelligent system. Are you seeing it at all? YES, you could become superhuman. You could remove questions, you could get intelligent feedback to make yourself better, physically better.

If you could read all these things, you could know in real time how food affected you. Ever known someone’s kid who went nuts if they didn’t eat, but you never really knew when they were going to go over that edge? You could be informed in real time. Ever exercised and felt like you weren’t improving? You just didn’t know how far to push. Did I lift enough, did I push hard enough? Ever known anyone that was on medical watch? Remove all the devices, let them go free, let them live, but let them live knowing that if something came up they wouldn’t be alone.

Now also imagine a device that could store all this info, or at least retrieve it, and be digitally tied to you. Have you ever gotten hurt and tried to get your digital records to take to another doctor? What if it were all available on your wrist? To that note, what if you got in a car wreck but had your entire medical history available with the flick of a scan? How many ID cards do you carry? Driver’s license, passport, credit cards, airline ticket, rental car info, hotel info, the list goes on forever. Imagine walking into an airport with nothing in your pockets, scanning your wrist to buy something, scanning your wrist to go through security, scanning it to check in for a flight.

This future isn’t that far off. It really just takes someone putting all the pieces together. It takes someone doing it right. It takes someone that can get the masses to adopt it. Over the past year, my research has led me to believe that Apple has been thinking about all of these things. My guess is that they may have been thinking about it right. My hope is that they have, my hope is that they create something that can make us better, that can generally improve the physical nature of people’s lives. If they haven’t, OR they fall short of my visions, I hope that someone else does.

Smile 🙂
Technology can be full of light or it can be dark. I’m optimistic. I’m a dreamer. Here’s to the light in all of us and all the little things that can help it shine.

Adapting Dreamsocket

As we head into the future, it always seems to be habit to reflect on the past. This year we quietly redid our company site, reflecting the past, showing the present, and opening up to the future. It was appropriate given how much has changed since our inception. What is even more interesting is that these dramatic shifts in our business happened almost overnight. However, it wasn’t that we just spun the wheel in a different direction and completely retooled and refocused when we woke up the next day. Luckily for us, we set ourselves up for a transition and it paid off.

What am I blabbering about? How does it apply to me? If you are reading this, then you have some level of interest in who we are and what we do. You either want to know how we tick or what prompts some of our success. The reason I’m blabbering is that to be successful in anything you have to be willing to adapt, and either be smart enough or lucky enough to adapt to the things that will bring you the most success. We managed to do a little of both and end up in a really good position.

In our case, we originally found extreme success fulfilling a very niche need: building custom online Flash video products for large media organizations. Knowing their business, needs, and systems, and being able to deliver on them made us a valuable asset in that space. Over time that space matured, slowed down, and shifted. As this was happening, we had already planted seeds as a company that grew more rapidly than we could ever have imagined.

The first seed was an internal project codenamed Poor Bear, which eventually became our first iOS game, Bear on a Wire. It was our foray into a new language, platform, and online business. We viewed it as an awesome way to just play around with technology and see what we could learn from it. We didn’t hold it up as a golden egg or anything beyond that. iOS was an interesting platform, and after a completely non-technical friend called up raving about his iPhone and how incredible email was, we knew we had to do something with it. I mean, if someone’s intro to email is through an iPhone, then there is something to be said about that. So we poured our hearts into the game, working on it in spare moments at times and exclusively at others. A lot was learned, fun was had, and in the end we had something great.

The Bear wasn’t your runaway Angry Birds, the golden goose wasn’t sitting in our office, and we don’t have islands YET (emphasize, YET because one day we’ll be having pirate wars). An interesting thing happened though, clients were interested in what we did. You see, we weren’t the only ones wanting to build things for our pocket toys. The problem was budgets sometimes have to wait to follow interests. Everyone was becoming interested, but not everyone had shifted their dollars yet. So as the year closed out, with a few dollars left on the table we were given a few chances to play around for our clients. Even though the Bear only pulled in enough for a nice dinner every month if we were lucky, it was our key into the next room. Our game showed our clients that not only could we play in the space, but we were quite good at it.

With a key in hand, we first peeked through the door, then someone swung it wide open. The budgets came in. We went from developing our first mobile app to dozens of iOS and Android apps overnight. As the online video space cooled, the mobile app space became red hot. That is great and all, but if you put a chicken in the oven without knowing how to cook it, no one is going to want to eat it after you’ve burnt it five times in a row. The quality just isn’t there. No matter what we are working on, we CARE and we put 200% effort into it. Details matter. This was no exception and helped us win a lot of trust from our clients in this space. We are viewed as partners rather than simply providers and continue to deliver new versions of flagship apps as time goes on. In addition, clients unsatisfied with their original vendors have brought us apps that we’ve been able to take over and really improve upon. Having a relationship like that is probably the best thing we could ask for, because it shows us that our work is really appreciated. It is acknowledgement that our clients can see how much care and effort we put forth.

The moral to this story is that things are always changing and that you should always play it smart and look for new and interesting things to explore. If you play your cards right or just luck out (maybe a bit of both), new opportunities will come to you that are bigger and brighter than you would have ever imagined.

The new site reflects this. It shows a little of what we’ve traditionally done, some newer things that we’ve gotten good at, and a glimpse of things to come. We have had the great fortune of working on just about every digital medium that we could hope to. That is what turns work into FUN for us and keeps us coming in! We’ve been very lucky. Check out our portfolio to get a small glimpse into the fun times that our clients have given us.

Bear on a Wire (previously Poor Bear) iPhone game released

Bear on a Wire Trailer from bearzo on Vimeo.

The Game

Our first iPhone game, previously code-named Poor Bear, is officially available in the App Store today under the name Bear on a Wire. Those of you who followed the progression of the game on our site (1, 2, 3, 4) know that this game didn’t start with designs, requirements, deadlines, or the promise of gold bars. Instead it was built on the premise that we could make something fun that we molded just how we wanted it. That mold shifted and turned over time. Even at the starting gate, we didn’t know what type of game we were making. The game really grew organically and took on a life of its own. I’m personally blown away with the outcome, especially considering this was Chad’s (the developer) first game and he went into it not knowing Objective-C. The design is a work of art as well. For those of you who know Trevor (the designer), you know you could expect nothing less. Words can’t do justice to what one designer and one developer have done with this game. It is simply amazing, and even though it is our own game, none of us can stop playing. That was the point though. We built something we loved. We hope you will too!

Support Us

We appreciate any support you can give us. For those with an iPhone: grab the game now, rate it, and review it!

For those wanting to get the word out, here are some links to blog, tweet, AIM, tell someone on a subway, etc. We will have flyers too that you can print and post on bathroom walls, telephone poles, and anywhere in plain view.


Press Release

Dreamsocket & TVM Studio are excited to announce they have just released Bear On A Wire.


Apple app store link:

About the game:
Our green hero, Bearzo, has had it! No more performing for “THE MAN” day in and day out. What! Do you think he is some kind of dancing bear? NO… he is a high wire bear, and it’s time for him to make his great escape from the Big Top. He loves his fans and his work, but he just wants to be free and feel his scarf blow in the wind as he shreds wire with the most insane moves ever attempted … on a Moped… on top of high voltage power lines. Get ready to feel the power of the 49cc, two stroke, and single cylinder stallion!

As you tear off on the wire, try to balance Bearzo and keep him from falling down into the 1.21 gigawatts that alternate through the wires below him (ah, the smell of burnt bear hair). While balancing on the wire, acquire crazy mad points by using the different stunt key combinations to generate some MOPED MAYHEM (Bearzo’s stunts include no hands, half twist, full twist, bear buck, back roll, front roll, jump roll, grinder, spin roll, spin buck, spin buck grinder, coat tail, coat tail kick, and the next to impossible coat tail kick spin grinder). Combine these stunts with full flips, double flips… triple flips…? Now you are just being crazy! Collect coins and rack up even more points. I know… you never saw collectable coins coming. Don’t get caught hibernating because it’s about to get all GRIZZLY up in here!

Get pumped for BEAR ON A WIRE.

Poor Bear Update 4: Collision Detection

I have been working on adding tricks to PoorBear over the last week. Trevor has sent us a ton of crazy animations for tricks (I will try to throw up a video preview of some of them soon). As a result, I was in desperate need of a way to generate collision verts in a manner other than plotting them by hand (yeah, I plotted and translated the verts for one animation by hand and it took about an hour). I will go over the method I used to solve this problem.

The problem

I am using the Chipmunk physics engine on PoorBear and very basic collision shapes for all objects to try and keep it as fast as possible. You can make very complex objects with Chipmunk, combining many circles and polygons, but I feel that a single convex poly will provide accurate enough collision for this particular game. So, PoorBear’s body and scooter are represented with seven verts which are depicted by the orange dots below:

Up until we decided to add tricks to the game these vertices were all that was needed to provide collision detection for PoorBear’s body and scooter. The only variation in the animation was the movement of his scarf which didn’t need any collision detection so the verts remained static. Since we want to be able to do some pretty crazy tricks, these verts are no longer sufficient because the trick animations aren’t close to the shape of these verts. For example, this is a frame from one of the tricks:

As you can see, not only is it a totally different shape but the image is a different size and the wheels and shocks of the scooter are also included. This particular animation has ten frames and each frame is different enough to warrant unique collision verts. We have around 15 different trick animations with as many as 40 frames each which is why it became unrealistic to plot the verts by hand.

I browsed the internet for advice on how to solve this problem, but being new to game development I wasn’t even sure what to search for. Someone mentioned to me that I could use a single-color image to generate terrain when I was working on the level editor for PoorBear, so I started thinking about how I could implement something like that and apply it to this problem. The solution turned out to be simpler than I had imagined and has saved me a ton of time.

The solution

My solution was to open each frame in Photoshop, create a new blank layer, and draw a 1px black (could be any color) dot where I wanted each vert to be. Then simply hide the original layer and save the layer with the dots to a file. Then I wrote a little script in my scripting language of choice, Python, to grab each pixel and convert it to coordinates that I can use with the physics engine. I have since extended the script so that I can generate the verts for multiple frames and animations at once, because all the animations for PoorBear are compiled into 1024×1024 images to cut down on the number of textures being swapped. However, I will just go over the basic steps of generating verts for a single image.
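As an aside, since the multi-frame extension itself isn't shown here, this is a rough sketch of how that part could work. The function name and frame dimensions are my own assumptions, not taken from the actual script. It only computes the crop boxes for each frame cell; the pixel scan described below would then run on each cropped cell.

```python
def frame_boxes(sheet_w, sheet_h, frame_w, frame_h):
    # Crop boxes (left, top, right, bottom) for every frame cell in a
    # sprite sheet, walking the sheet row by row, left to right.
    return [(left, top, left + frame_w, top + frame_h)
            for top in range(0, sheet_h, frame_h)
            for left in range(0, sheet_w, frame_w)]

# e.g. a 1024x1024 sheet of hypothetical 128x128 frames yields 64 cells
boxes = frame_boxes(1024, 1024, 128, 128)
```

Each box can then be handed to PIL, e.g. `cell = sheet.crop(box)`, and the single-image steps below applied per cell.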

Below is a sample of the previous image with several points drawn to form a loose polygon for collision detection. The points are overlaid on the image so that it is obvious what they represent, and they are drawn large for the sake of clarity. The red dots have to be 1px in the real thing.

Once the verts are pulled out of the image and converted to something the physics engine can understand, they will represent something like the following in the game:

The script

There is a great image handling module for Python called Python Imaging Library (PIL) which is required for this code to work.

First we need to open the image saved from Photoshop and pull out the pixels that were drawn. This can be done with the following code:

from PIL import Image

def grab_points():
    image = Image.open("/path/to/image/animation0000.png")
    pixels = image.load()
    width = image.size[0]   # the size property is a tuple (width, height)
    height = image.size[1]
    points = []

    for x in xrange(width):
        for y in xrange(height):
            # pixels[x, y] returns the tuple (R, G, B)
            if pixels[x, y][0] == 255:  # could just as easily detect any color
                points.append([x, y])

    return points

This code opens the image and loads the pixel data into memory and steps through each pixel, adding the ones with a red channel value of 255 to the points list. We could have also used a list comprehension which most likely runs more efficiently but is considerably less readable. This is what it would look like:

def grab_points():
    image = Image.open("/path/to/image/animation0000.png")
    pixels = image.load()

    points = [[x, y] for x in xrange(image.size[0]) for y in xrange(image.size[1]) if pixels[x, y][0] == 0]
    return points

Now that we have the location of all the pixels we drew in Photoshop, we need to convert them to something the physics engine can understand. Getting the pixel data was very simple thanks to PIL, and at this step these points could be used with any physics engine given the right translations. These next steps will be more and more specific to my situation (Chipmunk physics and the iPhone) but can be adjusted to most any project.

Chipmunk expects the verts to be in clockwise order and to form a convex poly. Currently, the verts are ordered by their x value. Given the image below, we need the verts in the order ABCDE but they are in the order ABECD right now.

I developed a simple algorithm which arranges the verts in the correct order. It has four basic steps:

  • 1. Iterate over all verts excluding the first and last
  • 2. Remove the verts with a y value greater than half the height of the image (the verts along the bottom of the shape, since image y grows downward), saving them in a temporary list
  • 3. Reverse the order of the temporary list
  • 4. Append the temporary list onto the original list

This is the code that does that:

def sort_points(points):
    length = len(points) - 1
    temp = []

    i = 1
    while i < length:
        y = points[i][1]
        if y > HEIGHT / 2:  # HEIGHT is the pixel height of the frame image
            temp.append(points.pop(i))  # pull the bottom verts out in x order
            length -= 1  # we are editing the list in place. since
                         # we popped a value, decrement the length
        else:
            i += 1

    temp.reverse()  # bottom verts must run right-to-left for clockwise order
    points.extend(temp)
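To make the reordering concrete, here is a small self-contained sketch (Python 3, and simplified: it splits on the midline in one pass rather than popping in place), run on five verts labeled like the ABCDE figure above. The coordinates and the 100px height are my own example values.

```python
HEIGHT = 100  # assumed image height for this example

def sort_clockwise(points):
    # Top-half verts keep their left-to-right order; bottom-half verts
    # are appended in reverse so the polygon loop runs clockwise.
    top = [p for p in points if p[1] <= HEIGHT / 2]
    bottom = [p for p in points if p[1] > HEIGHT / 2]
    return top + bottom[::-1]

# verts as scanned in x order: A, B, E, C, D (E is the lone bottom vert)
A, B, E, C, D = [0, 50], [25, 10], [50, 90], [75, 10], [100, 50]
print(sort_clockwise([A, B, E, C, D]))  # -> A, B, C, D, E
```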

At this point, we have pulled the pixel data out of the original image and sorted the points in an order that the physics engine will understand. Now we just need to translate the points to the coordinate system used by the physics engine. The pixels were stored linearly in the pixels list where the first pixel in the list represented the top left pixel of the image and the last represented the bottom right pixel. This can logically be thought of as a coordinate system with the origin in the top left and the positive y-axis growing downwards. Chipmunk uses the traditional coordinate system with the origin located in the center. We just need to loop back over every point and transform them to coordinates Chipmunk understands. This code will do that:

def transform_points(points):
    for point in points:
        x = point[0]
        y = point[1]
        point[0] = x - OFFSET_X if x > OFFSET_X else (OFFSET_X - x) * -1
        point[1] = (y - OFFSET_Y) * -1 if y > OFFSET_Y else OFFSET_Y - y

Now we have the list in an order and format that Chipmunk can use. Being that PoorBear is running on the iPhone, I just format this data to resemble a multidimensional C array and copy/paste it over into the code for the game. There are better ways to get the data over but copy/pasting is good enough for now.
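As a sketch of that last copy/paste step, something like the following could render the verts as a C initializer ready for pasting. `cpVect` is Chipmunk's vector struct; the function and variable names here are my own invention, not from the actual project.

```python
def format_as_c_array(points, name="frame0"):
    # Render a list of [x, y] pairs as a C array initializer for pasting
    # into the game code.
    rows = ", ".join("{%d, %d}" % (x, y) for x, y in points)
    return "cpVect %s[%d] = {%s};" % (name, len(points), rows)

print(format_as_c_array([[10, -5], [0, 7]]))
# -> cpVect frame0[2] = {{10, -5}, {0, 7}};
```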

The full script is below. I just chained each function together for simplicity.

from PIL import Image

OFFSET_X = 75  # image width / 2
OFFSET_Y = 75  # image height / 2

def grab_points():
    image = Image.open("/path/to/image/animation0000.png")
    pixels = image.load()

    points = [[x, y] for x in xrange(image.size[0]) for y in xrange(image.size[1]) if pixels[x, y][0] == 0]
    return points


def sort_points(points):
    length = len(points) - 1
    temp = []

    i = 1
    while i < length:
        y = points[i][1]
        if y > OFFSET_Y:
            temp.append(points.pop(i))
            length -= 1
        else:
            i += 1

    temp.reverse()
    points.extend(temp)


def transform_points(points):
    for point in points:
        x = point[0]
        y = point[1]
        point[0] = x - OFFSET_X if x > OFFSET_X else (OFFSET_X - x) * -1
        point[1] = (y - OFFSET_Y) * -1 if y > OFFSET_Y else OFFSET_Y - y


# chain the functions together
points = grab_points()
sort_points(points)
transform_points(points)

# format the list if needed
output = open('output.txt', 'w')
output.write(str(points))
output.close()


Poor Bear Update 3: Development Progress

From our last post, you got a glimpse into Trevor’s mind and where things were headed from a design standpoint. This update shows the concrete transition of those elements into the game. The title screen and elements have been incorporated, along with stunt recognition (flips, wheelies), item collection, and finer game controls.

Help make Poor Happy!

As you can tell, Poor Bear’s life is starting to hype up, but we have him running in different directions. He is a little confused and we’d like your input. How? Well we’ve played with different goals for Poor from beat the buzzer to collect and score. What do you think would make Poor fun? Either of those? A combination? Or something else? We have tons of ideas, but would like to know what you think in terms of the fundamental game play. Please share your thoughts in the comments or with us directly!

Poor Bear Stage 2: Design

Project Poor Bear is shaping up pretty nicely, with every day bringing in a little more “flair”. Not to leave you in the dark, we wanted to post some of the design progress and process. In addition, Trevor, the man behind the “flair”, kindly took a step back to share some insight into the origins of his ideas for the game and his process. Below are images and direct excerpts from Trevor’s mind, enjoy!

“The idea behind project poor bear is to mesh a couple of my favorite ideas with the awesome game engine Dreamsocket came up with. First, I love the idea of making a monumental escape from work. I think anyone can relate to that. Here we have a bear who is good at what he does, loves to do his thing, but is stuck inside the unrelenting system (the big top) that sucks the fun out of his work. Sound like your life? The next idea I’ve incorporated is one I’ve been sitting on for a while. As a kid, I used to skateboard. Trust me, I sucked… real bad. Even though I didn’t have any thrashing skillz, I could imagine the possibilities. I would always look up at the power lines and imagine riding a skateboard, or bike, or rocket boots on the power lines. Man, that would be so cool. I don’t think there could be a better escape route for our high wire bear. Of course we can’t just let our poor bear ride off into the sunset all golden parachute style. We have included a few challenges that should make for an interesting ride. 🙂

For the overall look, I wanted to make something that was just really fun to interact with. I was initially going to make something all stylistic and out there, but it has been a while since I just made something that felt cartoony. It’s almost a bad word these days. “CARTOONY.” Yeah, I feel bad even typing it. Well, I’m glad I did, because I think it really has a nice goofy look.”

Splash screen development

The development of the splash screen from a hand-drawn image to the (almost) completed image.

In-game assets

Potential objects you can expect to see in the game, from deadly barbed wire to not-so deadly squirrels.

Level one intro video

A sample intro/theme video we are using to define the game. It shows Poor Bear making his death-defying escape from the circus.

Kenny Speaking at NAB2009 on Games and TV Collaborations.

I was asked, agreed, and received confirmation that I will be speaking at the NAB conference in Vegas this month. The presentation/panel is titled “Game & TV Collaborations” and is focused on solutions that integrate games with video based entertainment. I will be showing off the Playstation Megasode that we built a few years back and participating in the subject discussion.

The presentation is slated for Thursday April 23 at 10:15.
You can find all the details here.

It should be quite interesting presenting at NAB, since the crowd is so different than a lot of the places I speak. It also marks my first attendance to the event, so I’d love to hear feedback from others that have attended in years past.

Announcing Project Codename: Poor Bear

The iPhone bug bit and we have started working on a side project code named POOR BEAR. The project is a small iPhone game collaboration between the folks at Dreamsocket and Trevor Van Meter, who we consider a friend of the family. For those who aren’t familiar with Trevor, you may remember his game Fly Guy, which garnered a lot of praise. Trev and I (Kenny) actually went to school together and were part of the same crew of friends, so we have roots. Personally, I look at him as a renaissance guy when it comes to illustration work. He literally does it all: cartoons, toys, comics, games, you name it. Needless to say, we are excited to be working together. Right Trev 😉


Trevor is working out the dynamics of POOR BEAR’s story line and it will evolve as we progress. Nothing is nailed in stone, but we have our basics. Our main character is a circus bear who rides bikes on tight ropes. Pretty hype already, huh? Who doesn’t love bears on bikes??? The game starts with our fearless character blasting out of a circus tent and landing on some power lines. He has somewhere he has to go, and he has to get there fast, so it is up to you to help him get there. A side-scrolling bearrific race against time!

Current State

[Video: early Poor Bear prototype showing the basic game mechanics]

You can see from the video above that our development started by just getting basic mechanics working in the game along with a few art elements and menu screens. Somehow the luck of the Irish hit, and we already have our fearless hero iPhone-ready and controllable via the accelerometer and screen interaction. Tilting the phone to either side controls balance, touching the front of the phone speeds him up, and touching the back slows him down. Although some of the game elements are limited now, they, along with the terrain, are getting generated from a custom level editor we built in AIR. This gives us the ability to really map things out and throw things in pretty rapidly. Expect good things 😉


Moving forward, there are many hours of work left at this point. We still need to settle on layouts for all the menus and screens, get art for them, add sound effects, add background music, and create levels for the game. Perhaps one of the most important tasks left is to implement a scoring system. We have not ironed out the details yet but are considering making the levels time-limited and scoring based on the quickness of MR BEAR. If we go this route, things like power-ups via time bonuses and speed boosts come to mind. Last but not least, our bear FLIPS, yes he FLIPS while jumping his bike! Obviously, we might have to take that into account too!

All that said, welcome to PROJECT POOR BEAR. Expect us to post our process and progress here, so you can follow along and add your 2 cents to the game. Help us shape it into something great. We want POOR BEAR to put you in the bear’s shoes and have you craving to play!

Teams, Chad Fuller, and Business Investments

When you own and run a business, the most important elements of your company are your image and the people working with you. This is even more important when you are a smaller business. If you are surrounded by the best of the best, that becomes the perception of what your company is. A small agile company composed of experts is a lot different than a large company with a few experts and a lot of worker bees. Both are valid models and neither is right or wrong. I opt for the quality-over-quantity approach, regardless of the income difference.

Therefore, when looking for folks to work on projects or to join the team, I look for people that are:

  • a) smarter than I am
  • b) completely devoted
  • c) care

It’s a decision that I don’t take lightly since I’m essentially asking someone to join a “family of friends”. That’s how I view work. It is part of your life, the people around you are part of your life, and you should surround yourself with those that bring out the best in you and themselves.

Chad Fuller

Last June, Mr. Chad Fuller sent me a note mentioning that he was moving to Atlanta. I knew Chad well, knew how smart he was, but also knew that he didn’t have any work experience. Point blank, experience is huge for us. Due to the positioning of Dreamsocket, we typically receive jobs I would refer to as high-experience work. Thus, we can’t have people work on the projects who don’t know the technologies better than they know their own name. It is our position and what we’ve built the business on. So Chad was in a way a gamble. Obviously there is risk with any gamble. You either win or lose. However, I took a pretty calculated gamble and came out ahead…. way ahead. If I were in Vegas, I would probably be the owner of the Wynn right now ;).

How did I win? Instead of throwing projects at Chad that he would tear his hair out with, I decided to invest in him and the company. Chad’s first project was our site. If you haven’t looked at the site yet, I highly advise that you do. Not out of self promotion, but to see what he accomplished. Before the project, Chad had never touched HTML or built a website. After the project he could boast a site that included a storefront, live docs, a bug tracker, and more, all under one dynamic system. Needless to say, I’m more than impressed. Being able to own and shape it himself, Chad really was able to take value in his creation and learn a lot (at least I think he did ;)).


Since the site was an internal project, it was an investment. We invested in defining our image more concretely, creating a way to extend our business, and developing ole Chad. Personally, I know what it’s like to run in his shoes. Developers that care want to learn as much as they possibly can, to work on great things, and to just enjoy what they do. It felt really good to give him a project that he could call his own, mold, and learn from. That is really what being a business owner can do for you: it can help you help others.

As much as the business will let me, that is what I intend to do: invest in the folks around me. If your folks have passion, let them run with it as much as you can afford. Your workers will grow in strength, which will in turn mean that you get an experience level you couldn’t get any other way. On that note, Chad got the iPhone bug and I’m letting him get all over it. It means diversification and it means he continues dealing with things he is really excited about. Wait and see what he’s got running ;).

Look for big things on Chad’s blog and our site