Twitter: Can trillions of Tweeting Twits be wrong?

I’ll be honest. I’ve never understood Twitter. It seems dumb.

I admit I’ve got one (making me dumb by association?), but it was for Uni. That’s right, Media made me get it, I swear.

I first signed up to see what all the fuss was about, and so I could understand my lectures. I followed a few things: CNN, the UN, Coldplay. All the important stuff. And then, I began to get followers. Complete randoms who would add me, and then, after realising I didn’t actually tweet, would quietly delete me. Weird. (I often get mistaken for Matthew Perry, no biggy.)

Since then I’ve managed to cut Twitter out of my password memory. But recently I’ve started again. For Net Comm, just to help stick my blog out there in the big World Wide Web. And while I’m still not a fan, I can’t help but wonder at how popular, and (apparently) useful, it’s become.

For one, look at the recent social media revolutions. Take the Iran election protests. Then the recent Arab Spring. Both are instances where people used Twitter, amongst other social media, to circumvent government censorship of traditional media, to great success. It seems the ‘140-characters-including-spaces’ is truly mightier than the sword.

Of course, being mindless and popular, it’s no surprise celebrities have jumped on the Twitter bandwagon. In fact, celebrities are the biggest twits of all: from Charlie Sheen, who hires someone to tweet for him (he’s probably too busy ‘winning’ with all his coke and pornstars), to Shaquille O’Neal, who recently announced his retirement from the NBA on Twitter.

That’s right, Shaq: four-time NBA champion, one of the greatest basketballers of his generation. A man loved by millions for both his post-ups and his personality, and he retires with a measly 11-character tweet. He didn’t even use all of his precious 140. Arrogant.

And so it made me think. If you’re breaking news, you usually want people to hear about it. As many people as possible. Thus, for Shaq to tweet his retirement first, either he’s being a clown, or someone very smart and well paid told him to do so. In the latter (and more likely) case, Twitter’s reach must not only be incredibly widespread, but in fact be considered even greater than that of the mainstream media.

Could it be possible? If everyone is not only tweeting, but checking other people’s tweets, then I guess so. It’s a more succinct (and clearly effective) version of Facebook.

I’ve always thought Twitter was for Twits. But the world is changing. And if everyone is a Twit, then maybe it’s time to start Tweeting?

(Still confused as to why it’s so popular? Check this out)

A changing media landscape

Just wanted to post this interesting video. It’s a slick insight into how net communications, and in particular social media, have changed over the past few years.

For those playing at home, it’s one of the many spin-offs based on the original “Did You Know” video that does the rounds of the Media and Communications department. I’ll post that one as well for those who haven’t seen it.

Enjoy.

Plundering Pirates or Digital Robin Hoods?

Pirates are thieves. Plain and simple. From Disney to Digital Rights Management, this is what we’ve been taught. Pirates steal the hard work of artists and distribute it for free, leaving them penniless and powerless in the face of an online epidemic.

But are we taking it too seriously?

There is a problem with depriving artists of remuneration for their hard work. Undoubtedly. And I in no way condone the theft of someone else’s work. Yet the losses of artists and corporations are often exaggerated, and, more importantly, it is undeniable that piracy has its benefits, both for the wider community and for the individual user.

Piracy is ultimately motivated by money (or, more precisely, the lack of it); however, it also plays a much more important role in society. In fact, piracy forms an integral part of contemporary culture, or rather of the distribution and development of culture. Without pirates, much, if not most, of society would go largely unexposed to art, music and technology. All over something as trivial as money. Does this seem fair?

The extent to which someone can learn and participate in their culture shouldn’t be based on how much they are able to pay. In any case, isn’t educating the human collective more important than deepening the pockets of greedy celebs and corporations?

South Park sure thinks so. And I tend to agree.

After all, piracy’s not that bad is it?

Piracy's not stealing, it's piracy!

[Image: Attribution-NonCommercial, some rights reserved by Travelin’ Librarian]

Cloudy Copyright

“You wouldn’t steal a car. You wouldn’t steal a handbag. You wouldn’t…”

Blah, blah, blah. We get it.

You know how it goes. Copyright, piracy, free. They’re all words we’re familiar with, and frankly sick of.

Are you a pirate?

[Image: courtesy Broken TV]

Yet, the recent announcements of various ‘cloud’ music services raise some interesting new points in the copyright debate.

Google recently announced its “Music Beta” service, following in the footsteps of Amazon a few months ago. Apple is expected to follow suit in June.

Both Google and Amazon, however, have released their cloud music services without backing from the four major record labels (EMI, Sony, Warner and Universal), and both have refused to impose any copyright conditions on their services. They’re working off the principle that you don’t, and shouldn’t, need permission to store something you already own somewhere else. All obtained legally, of course.

Fair enough. After all, a ‘cloud’ is just like a big hard drive, except online. (What is a cloud, you ask? Watch this. Or look outside.)

Yet Google has also said it will take action against pirated content. Just what kind of action remains to be seen. (I’m assuming something similar to how it treats pirated YouTube content.) But it’s a precarious balance. Will Google dob in pirates? Will people be too scared to use the Google service at all? And what will Apple do?

The copyright fat cats are used to getting what they want, but then again so is Google. It’s an interesting time for the online-music-technology-copyright world in general. I’m excited.

**UPDATE: Apple has just announced its own iCloud service, with the backing of (soon to be all four) major record labels.

As expected, Apple is running a very slick operation, both technologically and practically. It looks great, plus iPhones, iPods and iTunes libraries are all ready to be sent to the cloud in a click. The record deals also make things much quicker and easier: online copy matching means no time-wasting library uploads. Plus, it’s legal. (Yay, no jail!) The catch: you have to pay an annual fee to store non-iTunes-bought music. (Read: pirated music.)
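The matching trick is worth a second look. Apple hasn’t published how its copy matching actually works, but the basic idea is simple enough to sketch: fingerprint each track, and only upload the ones the service can’t already match in its catalogue. A toy illustration in Python (the catalogue and the file-hash ‘fingerprint’ here are hypothetical stand-ins, not Apple’s real method, which reportedly analyses the audio itself):

```python
# Toy sketch of 'scan and match' cloud syncing. NOT Apple's actual
# algorithm: a plain SHA-256 file hash stands in for a real audio
# fingerprint, and CATALOGUE is a made-up example.
import hashlib
from pathlib import Path

# Hypothetical: fingerprints of tracks the service already holds.
CATALOGUE = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(track: Path) -> str:
    """Stand-in fingerprint: a SHA-256 hash of the file's bytes."""
    return hashlib.sha256(track.read_bytes()).hexdigest()

def sync_library(library: list[Path]) -> None:
    for track in library:
        if fingerprint(track) in CATALOGUE:
            # Matched: the service just flags your ownership. Instant.
            print(f"matched, no upload needed: {track.name}")
        else:
            # Unmatched: the file itself has to be sent up the wire.
            print(f"uploading: {track.name}")
```

Matched tracks appear in your cloud library instantly; only the oddballs eat upload time. That’s what the record deals buy.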

(For more details check my latest post)

What this will mean for the popularity of Google and Amazon will be interesting to see. One thing’s for sure: copyright’s not dead just yet.**

Public vs. private: will Facebook kill privacy?

Week 5:

Analyse critically the following statement by Mark Zuckerberg while comparing it to privacy issues raised by online social networking collaborative practices:

Mark Zuckerberg’s comments on sharing (start at 0:26—stop at 0:39) (http://www.tubechop.com/watch/146252)

“When people have control over what they share, they’re comfortable sharing more. When people share more, the world becomes more open and connected. And in a more open world, many of the biggest problems we face together will become easier to solve.” (Zuckerberg, 2010)

Is privacy dead?

[Image: Attribution-ShareAlike, some rights reserved by opensourceway]

Facebook wants to rewrite the rules of privacy. Or rather, erase them altogether.

The above video, however—released in response to Facebook’s disastrous efforts to make users’ default privacy settings more open—shows that Zuckerberg and Facebook still don’t seem to understand what privacy means to their users.

Let’s consider his two main points:

1. “When people have control over what they share, they’re comfortable sharing more.”

Agreed. Privacy is not about hiding everything; it’s about controlling what you share with whom.

On social networks like Facebook, however, this is not so straightforward. People are often ignorant of exactly what they are sharing, and, even more concerning, other people can easily share things about someone else without their knowledge.

The recent Brocial Network scandal revealed the extent to which Facebook privacy can be exploited. Since all of the photos had been uploaded to Facebook by the girls themselves, everything the Brocial Network’s members did was technically legal. In fact, the obvious question is: if those girls didn’t want people to look at those photos, why did they post them in the first place? (A very Eric Schmidt line of thinking.) But it’s not that simple. Those girls may have been happy for their friends to stalk their semi-naked photos, but I doubt they appreciated 8,000 other randoms doing so.

It’s hard enough to control what you share on Facebook, let alone what anyone else may. Until this can be sorted, it’s ridiculous for Facebook to try to force people to share more.

2. “When people share more, the world becomes more open and connected. And in a more open world, many of the biggest problems we face together will become easier to solve.”

Logical, yet idealistic and impractical.

Sure, a more connected world would be great, and if everyone on the planet was working towards solving the same problem, well, let’s agree that six billion brains are better than one. Yet, again in the context of social networking, this attitude poses problems. If everyone knows where and when you go somewhere, and what you do and with whom, that’s not only creepy, it can be dangerous. Online predators and offline stalkers are enough of a problem without giving them a copy of your diary. (HINT: If you don’t want to be robbed, don’t status update your location.)

Thanks to the Internet, privacy is more important than ever. As Solove says, “reputations are forged when people make judgments based upon the mosaic of information available” (Solove, 2007: 30), and when drunken photos cost people their Uni degrees, and accidental YouTube stars are actually victims of cyber bullying, it’s hard to argue that decreased privacy will make life easier. Having 500+ friends can make it difficult to remember with whom you are sharing what, and often a little online over-sharing can lead to a lot of offline hair-tearing.

Whoops.

[Image: Courtesy All Facebook]

In any case, people aren’t ready for such an open world yet. Studies by the Pew Research Center, an American think-tank, have shown that young adults are in fact the most conscientious demographic when it comes to online privacy and reputation management. Over 70% of 18–29-year-olds admitted to having changed their social network privacy settings to limit what information was viewable, and by whom (Madden and Smith, 2010). After all, reputation is one of our “most cherished assets” (Solove, 2007: 30).

People have long been predicting the end of privacy. And now Zuckerberg wants to bring that about by making the world’s social network, well, more social. But if you’re listening, Mark, take note:

People still want their privacy. They want it protected. And that needs to be respected.

[Image: Attribution-NonCommercial-ShareAlike, some rights reserved by Florian SEROUSSI]

Words: 551 (excl. Zuckerberg quotes)

References:

Madden, M. and Smith, A. (2010) Reputation Management and Social Media: How people monitor their identity and search for others online. Pew Research Center, 26 May, p. 2. http://pewinternet.org/Reports/2010/Reputation-Management.aspx [accessed 16 April 2011]

Solove, D. J. (2007) The Future of Reputation: Gossip, Rumor and Privacy on the Internet. New Haven: Yale University Press, p. 30.

Zuckerberg, M. (2010) Mark Zuckerberg on making privacy controls simple. Facebook. http://www.youtube.com/watch?v=sWDneu_w_HQ&feature=player_embedded [video: first accessed 14 April 2011]

Week Three: What features can you identify in WordPress that define it as a Web 2.0 application?

Considering Tim O’Reilly’s eight design patterns for Web 2.0 applications, it is clear that WordPress is one such application. The four main design patterns displayed by WordPress are as follows:

The Long Tail: O’Reilly suggests that Web 2.0 is primarily composed of sites that cater to niche markets, which collectively attract an incredibly broad audience. This is a definite feature of WordPress, on which any produser can create a blog that caters specifically to his or her unique interests.

Users Add Value: O’Reilly argues that users must add data to an application for it to grow and succeed, and this is clearly a key aspect of WordPress. As a blog-hosting platform, WordPress relies on users adding to and adapting the software provided for it to even exist. Inevitably some users will add more value than others (those writing and improving software code may be few and far between), but every user contributes in their own unique and useful way.

The Perpetual Beta: O’Reilly describes how Web 2.0 applications are constantly updated and improved in real time. This differs from the previous practice of packaging a set version of a product, selling it, and then requiring the user to upgrade at a later date. The nature of the Internet and the WordPress platform means that being in ‘perpetual beta’ is an inherent quality of blog-hosting software. Since people do not have to download anything, the product can be updated constantly, with or without the awareness of the produsers.

Software Above the Level of a Single Device: As more and more people forgo their computers to access the Internet, Web 2.0 apps must be openly accessible to new devices. WordPress software is adaptable to and accessible on a whole host of new-generation technologies, including mobile devices such as the iPhone, and tablets such as the iPad.