My Interest in Updates is Fading

Every year, Apple hosts its Worldwide Developers Conference in San Francisco. I used to be really excited about it because I would expect a brand new device, or at least a new version of a device with amazing new features. Now it just seems like a dull affair. I say this mainly because Apple seems to be focusing on updating rather than innovating. For the past few years, each new iPhone has had improvements over the previous version, but nothing groundbreaking. At first I figured Apple was at that phase where a company just doesn't know which direction to go: should it take risks like it used to, or should it play it safe and rely on its brand to keep consumers? But the more I think about it, the more I believe the tech industry and its consumers are suffering from "yearly-update" syndrome.

What I mean is that we as consumers expect our devices to be upgraded consistently and our software to be updated periodically. We can't get excited about developments in technology anymore because upgrading is now a standard every company must meet; otherwise, people will switch to whatever else is new. Even huge software updates, like an operating system upgrade, barely seem to register. Mac users get their OS upgraded FOR FREE, and when it happened last year I remember being excited for about five minutes before losing interest. Five years ago I would have gone crazy for a new OS, especially one with the features the current one has.

Maybe it’s just a sign of the times, or maybe it’s just human nature. We get used to things and then forget about them. I keep thinking how we’ve become accustomed to touch screen phones, voice recognition, and rapid online connections when all of this didn’t even exist 7 or 8 years ago.

And this type of thinking affects everything digital, even browsing a website. For example, if a website doesn't have a clean design, I will immediately assume that something is wrong with my internet connection. Either that or the people running the website need to fire their designer. We as consumers have expectations, which is understandable since we're paying money for our products. But those expectations come at a cost: we can no longer really appreciate the capability of an iPhone or the intuitive nature of a laptop. It's the beauty and curse of standardization.

Conspiracy Theories

I love conspiracy theories. They either give you a sense that something's severely wrong with the world, which feels thrilling (like in a movie or something), or they offer explanations for the things you already know are wrong. Some of these theories are plausible, and probably have some truth to them. Others are just INSANE:

[Image: aliens meme]

But there's one conspiracy theorist who has made a name for himself and has clearly devoted his entire life to making conspiracies mainstream. That man is Alex Jones. I don't agree with a lot of what he says, but a few of his arguments have some clear evidence behind them. And Alex Jones usually sources his material from mainstream news sites, so even if you disagree with him, you know he isn't trying to make stuff up out of thin air (at least from what I've seen; I could be wrong). And you do get a sense that he is sincere about what he does. Whether you hate him or love him, one thing is for sure: he has effectively used the Internet to distribute his material. First, he has his radio show, Prison Planet, which is live-streamed and video-recorded. During his radio show he covers all of the relevant material for the news of the day and talks to listeners about their thoughts on the subject. He will also spend hours interviewing experts (or "experts") about a particular topic, such as Obama's plan to start World War 3 with Russia, or the possibility that the U.S. government is creating nano-robots to sneak into our mouths and inject fluoride onto our teeth. Yes, the range is that severe.

Alex Jones also has a website called Info Wars, which is his news aggregator. This is where he links to current articles on subjects that he or his team find important. And of course, like all good conspiracy theorists, he uses YouTube. The thing about watching a conspiracy video on YouTube is that, after finishing it, there are always a hundred other recommended videos that look really interesting. So you end up watching a ton of videos in one go and then end the night huddled under the covers.

Conspiracy theories have flourished online because of the tendency for people to search for things they already believe (again, the confirmation bias). But their popularity reflects something else in society. For example, I mentioned in a previous post that the controversy about vaccines "causing autism" has become a mainstream issue. And in other recent news, the debates about guns in the US have reached a boiling point, and conspiracy theorists believe that the government is trying to take away all access to guns so that no one will be able to defend themselves if there is a military takeover.

So why are all these conspiracies becoming so popular now? I think it has to do with a general distrust of government as a whole. If we're talking specifically about the U.S., the NSA scandal didn't help to quell fears about an ever-watching, absolute government. And the many revolutions occurring across the world demonstrate an overall antagonism toward the established powers. In both cases, the Internet played a big role in helping people spread theories about why these things were happening. I guess our increasing dependence on the Internet means that we will constantly have to deal with ideas about imminent wars and governments ruled by aliens.

 

The Music Industry and Internet Piracy

Remember the Napster controversy back in 2000? Good times. For those who don't know, Napster pioneered the whole online music piracy business starting in 1999 (it wasn't the first, but it was by far the most popular). The Recording Industry Association of America (RIAA) caught on to its method of transferring MP3s across networks and filed a lawsuit against the company (and Metallica played a huge role in attacking Napster as well). Napster was forced to shut down and pay a ton of money, and the name now lives on under Rhapsody.

I think online piracy is characteristic of our time, an era of free programs, movies, music, games, and information. But I want to focus on music right now, and what better way to start than to look at how piracy has affected the music industry and the way the world views it. And just so people know, I actually sympathize with the music industry on a lot of issues, although I disagree with how it is going about solving them. Everyone needs to understand that music, both as a recordable art form and as a file type, is UNIQUE compared to EVERY OTHER RECORDABLE PIECE OF ART. I'm not trying to sound arrogant, nor am I trying to say that music is superior to movies or video games. I'm merely saying that the way we consume music is completely different from the way we consume, for instance, Breaking Bad or Frozen.

First, the value of recorded music comes from repetition. You don’t listen to a song you like only once and then walk away. You listen to it again and again, often until you get it stuck in your head.

Second, recorded music is portable and requires little attention. Yes, of course all file-types are portable in the sense that they can be uploaded to any computer or device. What I mean by portable here is that music can be played in a variety of places with no real time investment – you can play it on almost any device at home, in your car, or at work while doing other things – you don’t need to be focusing your energy on the song unless you want to.

These characteristics have forced the music industry to make money in a comparatively limited number of ways. Keep in mind that movie studios make the majority of their money at the box office, something virtually unaffected by the Internet. Television networks make their money through paid subscriptions or advertisements. And many video game studios are protected by console or hardware restrictions, although their lost sales are the most comparable to the music industry's. But the music industry as a whole has almost none of these luxuries: it has no box office (you might try to compare live concerts, but I'll get to that in a bit), it has no way of placing advertisements on songs (people would stop listening), and paid subscriptions usually go to the website providing the streaming service. All the industry has is the sale of music and its concerts.

Before moving on, I want to address this latter point about live music. Remember that when I say "music industry," I'm referring to every part of the industry, including musicians. The average band makes between $20,000 and $50,000 a year, and because of the ease with which music can be obtained freely online, the vast majority of this income comes from live concerts. Compare that to the box office two days ago, when Godzilla made $93 million in its opening weekend. The movie industry works with much higher budgets than the recording industry, and the pay of its actors is almost never dependent on sales. I'm not trying to argue that the music industry doesn't also have incredibly large concerts that bring in a lot of revenue; I'm just pointing out that piracy has more adverse effects on the people actually making the music.

This doesn't mean that piracy has put any real dent in the profits of the recording industry. In fact, because of the repetitive and portable nature of music, people tend to buy songs they like even if they pirated them first, so that they can support their favorite musicians. Also, iTunes sales are absolutely fantastic, and I believe a lot of those sales come from word of mouth facilitated by sharing music online. On the whole, I think the recording industry doesn't need to worry about piracy; it just needs to focus on increasing the revenues paid to its musicians.

The Entertainment Industry and Popular Culture

The 20th century witnessed a lot of distinctive hairstyles, many of which are associated with a particular decade:

[Images: hairstyles by decade, from Louise Brooks's 1920s bob through the 1980s perm and the 1990s "Rachel" cut]

Now you're probably wondering, "why is he showing us pics of a bunch of women and a man that looks like a woman?" I'm trying to make the point that the reason we can connect a hairstyle to a period of time is that something was driving the popularity of a specific style within each period. So what was this determining factor?

The entertainment industry.

From the 1920's up until the late 1990's, people adopted hairstyles from movie actresses, TV stars, and musicians, leading to a "mass adoption" of culture. But if you look carefully, I didn't include a style from the 2000's or after. This is because the post-2000 period saw a more eclectic adoption of culture, and therefore a wider range of hairstyles. Just to make this argument clear, Ensemble Magazine created a list of hairstyles through the ages (the source of the above photos), and they had this to say about the 2000's:

"Anything is possible: Well, what can we say about the 2000's… We're a bit lost because the 2000's don't really have a distinct style. It's more of a mixture of everything we've seen throughout the years."

And now you’re wondering “well why did this happen Arman? Please shed some of your incredible wisdom upon us.” You wanna know what happened? The Internet happened.

Since the Internet allows us to freely choose what movies to watch and what music to listen to, our own preferences are enhanced. We have a more personal popular culture where we can indulge in our own cultural inclinations and communicate with people that enjoy what we enjoy. And with the online age, the confirmation bias (which I mentioned in my previous post) still plays a heavy role, but in this case it’s solidifying our own tastes and preventing us from adopting a much larger mass-culture.

But wait: doesn't the entertainment industry still influence people, especially when talking about what's "in" or "hip" or "groovy"? Yes, of course it does. But guess what else is happening that didn't happen before? The entertainment industry is now adopting the culture of everyday people.

This is easily seen in the music industry. For instance, where the 1980’s had a distinct synth-meets-rock vibe that was rigorously promoted by record producers, the post-2000 era saw a rapid decline in the ability of recording companies to control what music styles became popular. Instead, they’ve had to use two methods to make profitable music:

1) Reuse and recycle previously popular backing tracks, song structures, vocal tracks, or even lyrics. They may even mold them into a single musical style – this is actually how the current “pop” genre came into being.

2) Adopt musical styles that are becoming popular outside of the industry. The rise of dubstep is proof of this.

Since the entertainment industry has been deeply affected by the Internet, people working in the business have realized that they can no longer determine what’s popular, especially when places like Reddit or Twitter can change the world within seconds. Popular culture today has a more personal feel to it, and everyone has the ability to contribute to its development.

 

The Value of Scholarship in an Online Age

With the Internet came information, and lots of it. You can Google almost anything and you will probably get a result, or something close to what you were looking for. The accessibility and speed of websites allows us to find what we want and need in an instant, and if something is incorrect, we can quickly update it. And schools have caught on as well, even using iPads IN THE CLASSROOM. Seriously, high school must be awesome now.

But with all of this information, it can be difficult to figure out what's true and what's just plain made up. You've probably heard about the whole "vaccinations-are-causing-autism" fiasco, which is gaining more and more attention. If you haven't, you should check it out. What I'm getting at is that the benefits of the Internet are also its drawbacks: we have easy access to so many texts, blogs, and posts that it's hard for laymen in a particular field to figure out what's true and what isn't. And this must piss off a lot of academics.

I mean, think about it. If you're a person who has devoted their entire life to studying animals, and some guy out of nowhere starts talking about the existence of the donkey-dragon, you'd probably laugh at first, thinking the man just ripped the idea from Shrek or something. But then you slowly realize that his website has a massive following, and all of the people commenting on the front page who try to warn others about the idiotic nature of such an idea are getting down-voted and blocked from the site. So how did it come to this? Well, it's a little fallacy that we all commit, and that some of us commit more frequently: the confirmation bias.

You see, we all have an idea of what is true in our heads, so we tend to select evidence that supports what we already think is true and toss out everything else. So if people already believe that donkeys and dragons can mate (or want to believe in such a thing, which would be pretty cool and disgusting and awesome, I must admit), they are going to search for information that supports the existence of such fire-breathing, load-bearing creatures. Even if a scholar in the field of biology presented clear proof against the theory of donko-dragonian evolution, most of the people on donkeydragonsarerealfoshoscrewhaters.com would probably just ignore them. The Internet has given us the ability to filter out real knowledge in favor of false information that supports our beliefs, and to some extent, we all do it.

This doesn’t mean that academics are completely forgotten online. I mean many universities have open courses that give a college education for free online, and organizations like Khan Academy are doing an amazing job actually teaching people real skills with quick videos. It just means that, in an ever-expanding online world, people with knowledge have to shout a little louder than the rest of us.

 

Twitch Plays Pokémon

FYI: Twitch (twitch.tv) is a streaming website/platform that allows users to watch live coverage of other users playing video games and e-sports, or live-record their own sessions so others can watch in real time.

On February 12, 2014, a channel on Twitch began live-streaming gameplay from Pokémon Red, an incredibly popular game originally released for the Game Boy in Japan in 1996 and in the US in 1998 (along with the Blue version). What made this particular session different from the usual real-time streams was that it was interactive: the stream ran a custom program so that, while video of the game was streaming, people watching could type a command into the chat window. Each command corresponded to one of the original Game Boy buttons, and the program would immediately register that input in the game. This might sound confusing, so let's see if this helps:

[Image: a Nintendo Game Boy]

The program would associate commands with the original Game Boy (pictured above), so that the game itself could run as normal. So, for instance, if a user typed in "A" or "Start," it would register those exact commands in the game (as long as the commands were spelled correctly, of course). The picture below shows what a user would see: the game, the pooled commands, and the chat window.

[Image: the Twitch Plays Pokémon stream interface]
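
To make the mechanic a bit more concrete, here is a minimal sketch in Python of how a chat-to-button bridge like this could work. Everything in it is hypothetical (the button list, the handle_chat_message function, the stand-in press_button call); it's just an illustration of the idea, not the actual code behind the stream.

# A hypothetical sketch of a chat-to-button bridge like the one described above.
# None of these names come from the real Twitch Plays Pokémon code; they are stand-ins.

VALID_BUTTONS = {"up", "down", "left", "right", "a", "b", "start", "select"}

def press_button(button):
    """Stand-in for sending a button press to the Game Boy emulator."""
    print("Pressed:", button)

def handle_chat_message(message):
    """Treat a chat message as a button press if it spells a valid command."""
    command = message.strip().lower()
    if command in VALID_BUTTONS:
        press_button(command)  # the input is registered in the game immediately
    # anything else (regular chat, typos) is simply ignored

# A few chat messages arriving in order:
for msg in ["A", "Start", "lol hi everyone", "up", "UP", "b"]:
    handle_chat_message(msg)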

Since the program running the game received inputs from any user, the single player/character in the game was essentially being controlled by thousands of people at once, all competing to have their own inputs carried out by the game. This made simple tasks incredibly difficult to complete, with a particular area sometimes taking hours or even days to finish. You can see below how the game and character movements were constantly bombarded with user inputs:

[Animation: the character being jerked around by a constant stream of user inputs]

The live stream of Pokémon Red on Twitch quickly went viral, and at one point 121,000 people were inputting commands simultaneously, with millions watching. The game therefore became an online social experiment and sparked many questions: would users end up working together to complete tasks, or would those with less noble intentions (i.e., trolls) dominate the input feed, making the game impossible to play?

The programmers behind the system developed a voting mechanism so that, in situations that were otherwise impossible to get out of, users could switch to "democracy mode," which, unlike the normal "anarchy mode," compiled the incoming inputs and only executed the one that received the most votes. Over time, the wide popularity of Twitch Plays Pokémon garnered attention across many social media sites, like Imgur, Reddit, and 9Gag, with users asking friends to join them and creating their own stories, even mythologies, behind certain events in the game.
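
As a rough illustration of the difference between the two modes, here is another small Python sketch. Again, the function names and the voting window are made up for the example; the real system worked over live chat, but the core idea (execute everything versus execute only the most-voted command) looks something like this:

from collections import Counter

def press_button(button):
    """Stand-in for sending a button press to the Game Boy emulator."""
    print("Pressed:", button)

def run_anarchy(commands):
    """Anarchy mode: every command is executed in the order it arrives."""
    for command in commands:
        press_button(command)

def run_democracy(commands):
    """Democracy mode: tally one voting window and execute only the winner."""
    votes = Counter(commands)
    winner, count = votes.most_common(1)[0]
    print(winner, "wins with", count, "votes")
    press_button(winner)

# One voting window's worth of commands from the chat:
window = ["up", "up", "a", "left", "up", "a", "up"]
run_democracy(window)  # only "up" gets executed
run_anarchy(window)    # every single command gets executed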

On March 1st, the game was completed, with a total of 1.16 million participants and 55 million viewers. Since then, the channel has been streaming other Pokémon games in the series, even those released more recently. But one question remains: why did Twitch Plays Pokémon become so popular so fast?

Of course, one reason we can point to is that it was both interactive and live, so a user was literally able to play a game with thousands of other users, forming a community that was public yet somehow close-knit. But you would think that the vast amount of time it took to complete parts of the game would push people away, especially when there were points where the character was literally spinning around in circles for a good 20 minutes. And the ability to play games with other people online is definitely not a new phenomenon; we have so many amazing devices and games now, with mind-blowing visuals and graphics, for which online access is an industry standard. So why the rush for a game that came out in 1998?

One word: Pokémon

It is impossible to explain the emotional attachment fans have to Pokémon, especially to people who have never experienced it themselves. This is particularly true of people who grew up with the Pokémon world from the beginning. As an adult fan of Pokémon myself, I find it easy to talk about playing Pokémon games (or trading cards/watching the show, if you're still into that) with other people who have also been fans for the past 16 years, mainly because it was a part of our childhood. What's interesting is that Pokémon has remained popular for so long, and has actually grown substantially since the 90's. The brand itself has gained popularity on many social media sites, even those with a primary user base of people in their 20's or older. Even though Pokémon games have changed over time, improving gameplay and graphics, they have maintained ties to the original gameplay in Pokémon Red that kids back then loved and still love as adults. This could explain why Pokémon games continue to sell so well: you might expect Pokémon to focus its attention on children, but in fact a lot of its current success comes from sales to adults who were fans from the beginning. And I think the attention Twitch Plays Pokémon received reflects just how ingrained "Pokémon" is in the culture of people who grew up with it.

Microsoft’s Meme Mayhem

The summer of 2013 was not a fun time for Microsoft. In May of last year, Microsoft unveiled its new video game console, the Xbox One. On top of the terrible name, the console received a huge amount of criticism from gamers over features that Microsoft said would be implemented once the console hit the market in November. These included:

1. An always-online requirement: the Xbox One would have to be connected to the internet in order to play, even for games with no online component.

2. A once-every-24-hours online check: Microsoft would require users to have the system connected to the internet at least once a day or risk losing access to their console.

3. An online DRM check for all games: this meant that you could only play a game on your specific console, and no one else could borrow or buy your used games.

4. A required Kinect that couldn't be turned off: the Kinect is Microsoft's camera system for its consoles, and it also responds to voice commands, so an always-on camera constantly listening for your voice was pretty creepy, especially combined with the online requirements.

So as you can see, gamers were pretty angry that Microsoft was pushing a console that would effectively alienate anyone without a stable internet connection, prevent people from sharing or selling their games, and possibly watch them 24 hours a day. But what happened next was quite interesting: gamers started using memes to convey their feelings about the Xbox One:

[Images: Xbox One memes]

On top of this, Sony, Microsoft's main competitor in the gaming market, quickly realized how much trouble Microsoft was in and used the controversy to deal some huge blows to its archenemy by stating that the Playstation 4, its own new console, would not have such features (the announcement video is hilarious; they were incredibly direct with their statements).

And gamers cheered them on through memes:

[Images: memes cheering on Sony]

Now you might think that things couldn’t get any worse for Microsoft. But there was something else brewing during the summer of 2013 that would prove to be even worse…

[Image: Edward Snowden]

Yep, the NSA scandal broke around the same time that Microsoft decided to tell consumers it wanted them to buy an always-online gaming device that would constantly watch them through a camera that couldn't be turned off. Talk about a complete lack of kairos.

Things got so bad for Microsoft that when users analyzed pre-order statistics, they found that Amazon pre-orders for the Xbox One were way below those of the Playstation 4. So what did Microsoft decide to do? It announced that it was removing almost all of those features from the Xbox One.

Yes, memes can change the world.

Touch Screen Devices and Digital Rhetoric

Not many people realize this, but touch screen devices didn't just magically appear with the 2007 iPhone. There had been many earlier attempts at creating a device or computer that didn't require a keyboard. But for the most part, they failed. I mean seriously, look at this picture of the HP-150 from 1983:

[Image: the HP-150]

Now I suppose that was pretty cool for the time, but given that the operating system and interface were pretty much no different from those of any standard 80's computer, and that the sensors on the screen would get dusty and start failing, there was really no point in using a touch screen other than to convince people that you believe someone lives inside the monitor and the only way to communicate with them is by poking the screen.

Of course, the 90's brought the advent of the Palm Pilot, which I suppose was relatively popular. But again, the OS was usually bland, barely different from the computers of the time, and it used…a stylus. Ugh.

In 2002, Microsoft actually started running Windows XP on tablet PC hardware. Here are some examples:

[Images: early Windows XP tablet PCs]

As you can see, the design was really clunky and offered no benefit compared to the laptops being sold during this period. Again, the operating system and interface seemed to be copied straight from the normal version of XP, and they used…a stylus. Ugh X 2. Soon manufacturers were like "OH DEAR GOD WHAT HAVE WE DONE," realized the monster they had created would come to destroy the world through mediocrity, and eventually stopped production.

Now, in 2007, when the iPhone was revealed, what did Apple do differently? For one, they started small, applying their new design to a phone before creating a tablet (which I think was a smart idea). But more importantly, they made an entirely new operating system and interface JUST FOR THE DEVICE. This would later be known as iOS. And this interface was leagues above the typical phone OS of the time. So let's examine how iOS changed the game when it comes to digital rhetoric.

First, Apple decided that the best stylus in the world was our finger (yay!). The icons, digital keyboard, and app designs were all made to be used with one finger or thumb. This radically changed the way developers designed their software. Apps, especially games, had to be created so that all you needed were your fingers; otherwise, they wouldn't work on the iPhone. Thus, games like Angry Birds, which use finger motions to give a nuanced sense of physics, grew out of Apple's hands-on approach (pun intended).


 Yet the most important feature of the iPhone, in my opinion, was the fact that the interface was incredibly fast. Attempts at creating touchscreen devices before the iPhone failed not simply because they were practically pointless, but because they were also incredibly slow and prevented the user from enjoying the device. With the 2007 iPhone, the OS was really quick compared to pretty much every phone on the market. But what does speed have to do with digital rhetoric?

Well, the fact that the OS and interface were really fast meant that the entire device responded to your natural movements and was therefore intuitive: you don't need to be told how to use it. In fact, Apple has never packaged actual instructions with its iOS devices. Therefore, both the design and the content of apps on the iPhone had to make use of that intuitive nature, or they wouldn't be allowed on the device. Even the standard apps from previous Apple devices, like iTunes, were completely redesigned for the iPhone. The device's speed allowed for quick scrolling, zooming, and sliding, which removed the need for excess buttons and supported the whole "one finger" design. This made the OS look clean and very attractive.

Of course there are other features that have probably influenced the way people present material on the iPhone, such as limiting physical buttons and giving people the ability to program apps themselves. But the speed of the device, combined with the interface, created a standard for touch screen devices we can see across tablets and phones today. Websites, games, and apps on tablets and smartphones are now all designed to maintain this intuitive nature.