Optimism about Journalism

You hear them all the time, public sneers about the journalistic future: ‘Journalism is on its way out’, ‘journalism’s dead’ and ‘citizen journalism is where the future lies’. As a media and communications student, the same opinions echo through the voices of influential people around me, with comments like ‘oh, you’re doing journalism subjects, why is that?’ and ‘media and communications, hey? You’re not majoring in journalism, are you?’ The truth is, I’m not. In fact, those influential voices and the widely discussed declining statistics of mainstream media succeeded in turning me off even contemplating it as a major of study.

In an interview discussing the future of journalism, David Carr of The New York Times suggests that journalism ‘back in the day’ really wasn’t that great, and that the changes we are seeing today due to technology are all part of a natural evolution (Boston University 2014). In the same interview, Andy Lack of Bloomberg Media suggests that the changes occurring with the rise of digital media should be expected, enjoyed, used and discovered rather than feared. He even goes so far as to say that these advances have brought us into the ‘golden age of journalism’ (Boston University 2014).

So is the future of journalism really looking as bleak as many assume? People are now consuming news their way, and keeping up with such behavioural preferences is often at the core of a successful business plan, as Twitter demonstrates. Twitter was designed for individuals to send out tweets in a one-way form. However, its users wanted more. They wanted to talk, they wanted to reply to other tweeters, and they began using the @ handle to direct their conversations. Twitter incorporated this two-way communication via the @ handle into the very core of its offering and, as a result, it is now one of the most successful social media sites on the planet.

Tom Rosenstiel, Director of the American Press Institute, spoke of the challenges of new media and user-generated content in a recent TED Talk, stating that ‘what disrupted us will now begin to save us’ (TEDx 2013). He suggests that the notion that people are turning away from the news is simply not true, even though the statistics around traditional media’s decline might suggest it. News is now on demand, and audiences are simply online, consuming it very differently.

The journalism landscape is evolving, just as it has in the past. If journalists can offer news to audiences how they want it, where they want it and when they want it, then success should follow. People place value in reliable information and trustworthy sources. In the vast sea of information available, if audiences were offered news ‘their way’ by both a novice writer and a respected journalist, it is fair to assume they would choose the journalist. It’s all about keeping up, and I believe we are going to see the most adaptive media corporations begin to prosper once again.

References:

Boston University 2014, NYT’s David Carr on the Future of Journalism, online video, 6 March, Boston University, viewed 17 March 2014, https://www.youtube.com/watch?v=WPlazqH0TdA

TEDx Talks 2013, The Future of Journalism: Tom Rosenstiel at TEDxAtlanta, online video, 28 May, TEDx, viewed 17 March 2014, https://www.youtube.com/watch?v=RuBE_dP900Y


Cyborgs: A Fictional Reality

‘Chiba. Yeah. See, Molly’s been Chiba, too’. And she showed me her hands, fingers slightly spread. Her fingers were slender, tapered, very white against the polished burgundy nails. Ten blades snicked straight out from their recesses beneath her nails, each one a narrow, double edged scalpel in pale blue steel. – Gibson (1988)

What you’re reading is an excerpt from William Gibson’s (1988) ‘Johnny Mnemonic’, a piece of cyberpunk literature depicting a science fictional world. Here, each human body’s surface and organic structure has been technologically manipulated and enhanced, creating ‘cyborgs’ with unimaginable strength and supremacy. While Molly has retractable razor blades built into her hands, others have the teeth of a Doberman or sophisticated inbuilt information storage systems.

The ‘Johnny Mnemonic’ world seems as far-fetched as science fiction comes, but when we look a little closer at the world around us, the technology is in fact already here, and the cyborg is not such a fictional creation after all.

Tomas (2000) suggests that Gibson’s fictional world is slowly becoming a very real part of contemporary existence, and that given recent advances in information technology, genetic engineering and nanotechnology, changes like these will soon encompass the human body and its sensorial architecture.

In our world of rapidly evolving technologies, the human body is increasingly open to technological enhancement. We’ve given super-human vision to the colour-blind, developed high-tech prosthetic limbs, implanted computer chips and information storage devices, grown a cybernetic piece of living tissue, and forged a bio-hacking phenomenon. We are now seeing the gradual merging of man and machine, and it is creating capabilities that far exceed typical human functions.

So instead of viewing cyberpunk literature as radical science fiction, perhaps we should inspect these texts as theories of the future, giving us an insight into human evolution and determining which body alterations will provide the best competitive edge in a prospective cyborg world. If you could make one alteration to your wiring or physical structure to create your ultimate cyborg self, what would it be and why?

[Image: Cyborg Eye (Smart 2010)]

References:

Gibson, W 1988, ‘Johnny Mnemonic’, Burning Chrome, Grafton, London, pp.14-36.

Smart, S 2010, Cyborg Eye, image, Bike Rdr, viewed 23 March, http://blog.bikeridr.com/2010/03/advantage-cyborgs/

Tomas, D 2000, ‘The Technophilic Body: On Technicity in William Gibson’s Cyborg Culture’, in Bell, D & Kennedy, B (eds.), The Cybercultures Reader, Routledge, London, pp.175-189.

A Battle of Philosophies: Why Android Will Triumph Over iOS

The battle between the two futures of the mobile net is raging, and it echoes the PC war of the 1990s. As mobile connectivity is set to take precedence over desktop connectivity, this current battle is of equal importance. Call it what you will – it is a war waged between Apple and Google, or iOS and Android.


Personally, I don’t find the choice between the two adversaries an easy one; despite being so dissimilar, the two products feel equally matched in terms of benefits and appeal.

In the final quarter of 2012, Android had secured a 70% share of global smartphone sales, versus 21% for iOS. In tablets, however, iOS took the lead in 2012 with a 53% share, leaving Android trailing close behind at 42% (McCracken 2013).

The battle between the two comes down to a contest between their creators’ philosophies. Steve Jobs built Apple’s core business model upon closed devices. Apple wanted control not only over the platform itself, but also over the platform’s content and the consumer’s use of it.

Google co-founder Larry Page had a different plan. Through the purchase of Android, the company’s core products emphasized the flow of information and connectivity. Users could not only alter their devices however they saw fit, they could write their own software, using the technology in unpredictable new ways. With such connectivity and freedom, Android’s benefits are undeniably clear in an increasingly connected world.

So who will triumph? If current consumer statistics are anything to go by, the future of mobile connectivity points to the open and generative Android platform. Perhaps we should also consider the opposite occurring – a public shift towards iOS based on ease of use and aesthetics, leaving Android to the gamers and tech-geek developers. Google is likely to triumph in either situation. Why? For the consumers who opt for iOS, Google’s search engine is still likely to be their first port of call and, as Daniel Roth (2008) explains, ‘if the only thing Android achieves is getting more people to spend more time online, then Google still profits. More users mean more people viewing pages with Google ads’.

Without a change in ideology, the longevity of iOS looks bleak. As Derrick Brown (2013) put it so eloquently, ‘Android seems to be growing into the worm that eats the Apple’s core’.

References:

Brown, D 2013, The Epic Battle Between Apple & Google is All But Over – Who Won?, Read Write, weblog post, 17 May, viewed 20 October 2013.

McCracken, H 2013, Who’s Winning? iOS or Android? All the Numbers, All in One Place, Time, weblog post, 16 April, viewed 20 October 2013.

Roth, D 2008, ‘Google’s Open Source Android OS Will Free the Wireless Web’, Wired, 23 June, viewed 18 October 2013.

The Evolution of Square Eyes: Google Glass and Convergence

By convergence, I mean the flow of content across multiple media platforms, the cooperation between multiple media industries, and the migratory behaviour of media audiences who will go almost anywhere in search of the kinds of entertainment experiences they want. –  Henry Jenkins (2006, p.2)

When conceptualising Jenkins’s notion of convergence, the foundations of our everyday smartphones cannot be overlooked. Jenkins (2006, p.5) identifies these devices as the electronic equivalent of the ‘Swiss army knife’, and it’s not hard to see why. They function as a phone, messenger, email client, clock, calendar, camera, photo album, notepad, calculator, map, video streamer, internet browser, ringtone maker and… yeah, you get the point. They are revolutionary, and so are the ways in which users engage with them.


Convergence is not an endpoint, but rather a process of continual innovation (Jenkins 2004, p.34). It is suggested that, in the coming years, we can expect to see the Swiss army knives of our current media environment being adapted, innovated and converged, with ‘wearable’ computing technologies taking the spotlight.

Let’s take a look at the wearable innovation that is Google Glass, set to be released to the general public next year. The device is worn like a pair of glasses and hovers just in front of the user’s eyes, featuring voice commands and instant responses. Glass takes photographs, records videos, makes calls, sends texts and answers any questions the user may have, all viewed on a tiny screen projected into the top corner of their vision.

Audiences are just as intrigued by the prospect of wearable computing as they were by the smartphone, albeit with increased skepticism. The inclusion of a camera, video recording and user-only viewing on such a discreet device raises obvious privacy concerns and is causing a stir amongst media audiences, some of whom are perhaps wishing they were born into a simpler technological era.

Currently, there are few applications running on Glass and (let’s be honest) it is not the most striking piece of headwear around. However, if history is anything to go by, these obstacles will soon be overcome. In the coming years, we could expect to have our favourite designer frames fitted out with Glass technology, and as the device grows in popularity, we can imagine a growing number of apps and features continuing to converge within the same single device. Along with the telegraph, record player, Discman and alarm clock, perhaps the smartphone will soon join the list of obsolete technologies in the media history books.

Jenkins (2006) suggests that new convergent technologies bring with them unpredictable patterns of consumption. So what can we expect the arrival of wearable computing to bring? Undoubtedly, we can expect to see changes in norms and interactions on professional, social and intimate levels, just as we did with the smartphone. Smartphones were once an acquired novelty, yet they now frame popular culture. I believe this technology extends beyond novelty. Google Glass and comparable technologies are simply another leap in the process of our converging media world, and a pretty exciting leap at that.

Remember the old saying that if you sit too close to the screen you will end up with ‘square eyes’? Well, Google Glass may play a role in shaping human evolution in the years to come…


References:

Jenkins, H 2006, Convergence Culture: Where Old and New Media Collide, New York University Press, New York.

Jenkins, H 2004, ‘The Cultural Logic of Media Convergence’, International Journal of Cultural Studies, vol.7, no.1, pp.33-43.

iLoo? Guilty.


Okay, so this may be a cringeworthy topic, but I think I need to shine a little light on the matter: I’m talking about media consumption in the most private of all private spaces, the bathroom.

Now, if I’m being honest, my smartphone typically doesn’t leave my side. And if I’ve got to go, it comes along with me. I’m not a gamer, Facebooker or Tweeter whilst on the loo; I’m too much of a germaphobe for that. But if I get a text, I will have a read.

Now, before you begin to judge, take note that the stats are on my side. A recent survey by Sony and O2 found that a whopping 75% of both males and females use their smartphones whilst on the toilet, with 25% admitting to making a call (Drewett 2013). Furthermore, a quarter of men admit to sitting down to urinate so that they are ‘hands-free’ to continue their smartphone use… ahhhh?

To seek further reassurance (so as not to humiliate myself in front of my fellow bloggers), I decided to carry out a little research of my own via a group iMessage with my closest pals. When asked if they engaged with their iPhones whilst on the loo, their varying responses were amusing and fascinating, to say the least:

J: ‘I go on Facebook chat, Instagram, Twitter and check my emails, I do it all. Nothing wrong with that!’
A: ‘Yep! And Candy Crush! That’s a given.’
K: ‘Omg. Never.’
C: ‘I do it all the time!!’
K: ‘Ummm, please tell me that you all Dettol wipe your phones when you’re done?’
C: ‘I said I use my phone Kiaya, not wipe my bum with it.’

… and so the banter continued.

So as it turns out, I’m not alone and media use in the bathroom is gaining momentum. Perhaps it really is time to purchase some Dettol wipes.

When exactly did this method of consumption become acceptable daily practice? Or is it acceptable at all? What are your thoughts? Media technologies, and our questionable consumption of them, have transformed the most private of all spaces into one in which we are privately yet unavoidably connected. An icky thought, to say the least.

References:

Drewett, M 2013, ‘75% of people use their phone on the toilet’, Digital Spy, 8 March, viewed 2 August,  http://www.digitalspy.com.au/odd/news/a464219/75-percent-of-people-use-their-phone-on-the-toilet.html