The Push-Pull Between Author and Reader

Dear Reader,

… not the opening you were expecting? Between the 16th and 19th centuries, this was how poets, novelists and essayists addressed their audiences (Dorner 1993). The writer had to ‘woo’ the reader from the very beginning, and there was an expectation that once readers began, they would remain faithful and loyal, staying with the text from its first word to its very end.

My, oh my, how the author-text-reader relationship has changed. We now exist in a society of textual ‘users’. With masses of information digitally available to us on any one subject, we skim, we study and we scan until we find something that is of use to us – a notion entirely relatable to me as a university student. Dorner (1993) says that the author has adapted and now greets their audience with a “Look, I know you haven’t got time to fall in love, so I’ll inject you with my ideas as quickly as I can”, and that instant gratification suits our generation of readers just perfectly.

The digital media environment has also created a new ‘use’ of texts, as the notion of interactivity becomes increasingly prominent. With the ease of cut-and-paste functions, audiences can now co-participate, re-sequence and interactively transform texts as they see fit.

Take music, for example. It was once composed and produced for the audience to listen to; the writer wrote, and the audience listened. Simple. Nowadays, when a piece of music is produced, not only is it consumed, it’s also expected to be remixed by other producers or covered by different artists, played alongside video productions and transformed into digital ringtones. Cover (2006) suggests that this audience interactivity with texts redefines and blurs the traditional author-text-reader relationship even further.

Intellectual property is problematised in the online environment, and texts are now valued as commodities rather than for what they truly are. Ultimately, we can see how the digital arena has transformed the notion of authorship since the days of ‘Dear Reader’, when audiences offered the writer their respect and loyalty.

References:

Cover, R 2006, ‘Audience inter/active: Interactive media, narrative control and reconceiving audience history’, New Media & Society, vol.8, no.1, pp.139-158.

Dorner, J 1993, ‘When readers become end-users: Intercourse without seduction’, Logos, vol.4, no.1, pp.6-11.

Cyborgs: A Fictional Reality

‘Chiba. Yeah. See, Molly’s been Chiba, too.’ And she showed me her hands, fingers slightly spread. Her fingers were slender, tapered, very white against the polished burgundy nails. Ten blades snicked straight out from their recesses beneath her nails, each one a narrow, double-edged scalpel in pale blue steel. – Gibson (1988)

What you’re reading is an excerpt from William Gibson’s (1988) ‘Johnny Mnemonic’, a piece of cyberpunk literature depicting a science-fiction world. Here, the human body’s surface and organic structure have been technologically manipulated and enhanced, creating ‘cyborgs’ with unimaginable strength and supremacy. While Molly has retractable razor blades built into her hands, others have the teeth of a Doberman or sophisticated inbuilt information storage systems.

The ‘Johnny Mnemonic’ world seems as far-fetched as science fiction comes, but when we look a little closer at the world around us, the technology is in fact already here, and the cyborg is not such a fictional creation after all.

Tomas (2000) suggests that Gibson’s fictional world is slowly becoming a very real part of contemporary existence, and that given recent advances in information technology, genetic engineering and nanotechnology, changes like these will soon encompass the human body and its sensorial architecture.

In our world of rapidly evolving technologies, the human body is increasingly open to technological enhancement. We’ve given super-human vision to the colour-blind, developed high-tech prosthetic limbs, inbuilt computer chips and information storage devices, grown cybernetic living tissue and forged a bio-hacking phenomenon. We are now seeing the gradual merging of man and machine, creating capabilities that far exceed typical human function.

So instead of viewing cyberpunk literature as radical science fiction, perhaps we should read these texts as theories of the future, giving us insight into human evolution and into which body alterations might provide the best competitive edge in a prospective cyborg world. If you could make one alteration to your wiring or physical structure to create your ultimate cyborg self, what would it be, and why?

[Image: Cyborg Eye (Smart 2010)]

References:

Gibson, W 1988, ‘Johnny Mnemonic’, Burning Chrome, Grafton, London, pp.14-36.

Smart, S 2010, Cyborg Eye, image, Bike Rdr, viewed 23 March, http://blog.bikeridr.com/2010/03/advantage-cyborgs/

Tomas, D 2000, ‘The Technophilic Body: On Technicity in William Gibson’s Cyborg Culture’, in Bell, D & Kennedy, B (eds.), The Cybercultures Reader, Routledge, London, pp.175-189.

#Selfie Obsessed or Self-expressed?

A little over 40 years ago, a small indigenous community of Papua New Guinea had their photographs taken and shown to them in Polaroid form. These people struggled to interpret them at first, but as they began to recognise themselves, their stomachs trembled and their faces filled with fear, as the “terror of self-awareness” set in (Wesch 2009).

Today, in stark contrast, we exist in a world filled with self-portraits and generations obsessed by them. The selfie is taking over, and it’s safe to say that the terror of self-awareness has undoubtedly diminished. Many suggest that the reason we see these mirror shots, pouting teens and sexually suggestive poses is that the posters are showing us how much they love themselves and want us to hit the “like” button to reinforce the claim (Nelson 2013). The explosion of these photographs is read as a token of our unusually narcissistic society (Saltz 2014), and it seems so simple to write selfie-posters off as proud and self-absorbed.

[Image: Goodger, L 2013, Instagram profile]

One of my friends is ‘that girl’. You know the one: she lives for the prospect of showcasing her latest duckface (with the #newlipstick) and sees every mildly exciting event as an opportunity to check in on Instagram. She’s got a reputation for it and has undoubtedly been unfollowed and unfriended many a time for the repetitive clogging of news feeds. It would be simple to label her vain and self-obsessed, yet she actually has the lowest self-esteem of anyone I know. So is the selfie a little more complex than many first assume?

The reasoning behind each act of self-documentation belongs to its creator alone. Indeed, many post a selfie for validation; however, it seems wrong to typify every self-photographer with the same motivation and intent. What about a selfie taken in protest, or a selfie taken to bring a smile to the faces of others?

Selfies are the ultimate self-expression, and since when did we become a world that hates on self-expression rather than embracing it? Like it or not, the selfie is shaping society as we know it and it’s here to stay. As far as I’m concerned, take advantage of that front-facing camera and #self-express your little hearts out.

References:

Nelson, O 2013, ‘Dark undercurrents of teenage girls’ selfies’, The Age, 11 July, viewed 22 March.

Saltz, J 2014, ‘Art at Arm’s Length: A History of the Selfie’, Vulture, 27 January, viewed 22 March.

Wesch, M 2009, ‘YouTube and You: Experiences of Self-Awareness in the Context Collapse of the Recording Webcam’, EME, vol.8, no.2, pp.19-93.

Who is Empowered? Ourselves or the ‘Things’?

What if I told you that you could use the internet to track your elderly grandmother’s movements to ensure that she was alive and well? Or that your near-full garbage bin could signal to the garbage depot that it’s time to stop by for a collection? What if you’d forgotten to take your daily medication and a sensor in the bottle itself could send you a friendly text as a reminder? To me, these applications of technology seem like an odd combination of the Truman Show and sci-fi madness, when in reality these particular connections already exist.

These relationships rely on what has been termed the ‘Internet of Things’, where internet connections are bringing previously passive objects to life through the use of networked sensors and RFID tags. Bleecker (2006) refers to these connected objects as Blogjects – blogging objects – and suggests that once the objects are connected to the internet, they become enrolled as active participants contributing to social exchange and conversation.
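To make the medication-bottle scenario above concrete, here is a minimal sketch in Python of how such a ‘blogject’ might behave. Everything in it is hypothetical – the weight-sensor function is a stand-in stub and the ‘text message’ is just printed – but it shows the basic loop Bleecker describes: a passive object gains a sensor, a schedule and a voice.

```python
from datetime import datetime, time
from typing import Optional

DOSE_TIME = time(hour=8)   # the daily dose is due at 8:00 am
DOSE_WEIGHT_DROP = 0.5     # grams removed when one pill is taken

def read_weight_grams() -> float:
    """Stub standing in for a real load sensor in the bottle's base."""
    return 42.0  # placeholder reading

class MedicationBottle:
    """A hypothetical 'blogject': a pill bottle that reports on itself."""

    def __init__(self) -> None:
        self.last_weight = read_weight_grams()
        self.dose_taken_today = False

    def check(self, now: datetime) -> Optional[str]:
        current = read_weight_grams()
        # A drop in weight means a pill has left the bottle.
        if self.last_weight - current >= DOSE_WEIGHT_DROP:
            self.dose_taken_today = True
        self.last_weight = current
        # Past dose time with no pill taken: the object speaks up.
        if now.time() > DOSE_TIME and not self.dose_taken_today:
            return "Reminder: you haven't taken your 8am medication yet."
        return None

bottle = MedicationBottle()
message = bottle.check(datetime.now())
if message:
    print(message)  # a real blogject would send an SMS or post online
```

What strikes me is how little ‘intelligence’ is needed: one sensor reading and one rule, and the object becomes an active participant in the conversation.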

In a society already alarmed by the marketing data collection of platforms such as Facebook, we can be assured that the Internet of Things sits at the heart of privacy and security concerns.

With these networked sensors and tags already finding their way into cars, household appliances and clothing for tracking and monitoring purposes, consider what digital footprints are being left behind by consumers. As Bleecker (2006) puts it, ‘where our blogjects go, someone always knows’.

No longer will it simply be our age, postcode and comparably trivial private information that is available as data; it will be our travel routes and destinations, the time we leave for work and arrive home, what we have for dinner, when, and with whom, and so on.

So what happens if this data falls into the wrong hands? What happens when the human population begins to behave differently because our every move is being monitored by physical objects in our homes, in public and out of our direct control? Who exactly are we empowering with these connections: ourselves, or the ‘things’?

References:

Bleecker, J 2006, ‘Why Things Matter’, A Manifesto for Networked Objects


A Battle of Philosophies: Why Android Will Triumph Over iOS

The battle between the two futures of the mobile net is raging, and it echoes the PC war of the 1990s. As mobile connectivity is set to take precedence over desktop connectivity, this current battle is of equal importance. Call it what you will – it is a war waged between Apple and Google, or iOS and Android.


Personally, I don’t find the choice between the two adversaries an easy one: dissimilar as the products are, they feel remarkably equal in benefits and appeal.

In the final quarter of 2012, Android had secured a 70% share of global smartphone sales, versus 21% for iOS. In tablets, however, iOS took the lead in 2012 with a 53% share, leaving Android trailing close behind at 42% (McCracken 2013).

The battle between the two comes down to a contest of their creators’ philosophies. Steve Jobs built Apple on a core business model of closed devices. Apple wanted control not only over the platform itself, but also over the platform’s content and the consumer’s use of it.

Co-founder of Google, Larry Page, had a different plan. Through the purchase of Android, the company’s core products came to emphasise the flow of information and connectivity. Users could not only alter their devices as they saw fit, they could write their own software, using the technology in unpredictable new ways. With such connectivity and freedom, Android’s benefits are undeniably clear in an increasingly connected world.

So who will triumph? If current consumer statistics are anything to draw from, the future of mobile connectivity points to open and generative Android platforms. Perhaps we should also consider the opposite occurring – a public shift towards iOS based on ease of use and aesthetics, leaving Android to the gamers and tech-geek developers. It’s likely that Google will triumph in either situation. Why? For those consumers who opt for iOS, Google’s search engine is still likely to be their first port of call, and as Daniel Roth (2008) explains, ‘if the only thing Android achieves is getting more people to spend more time online, then Google still profits. More users mean more people viewing pages with Google ads’.

Without a change in ideology, the longevity of iOS looks bleak. Put so eloquently by Derrick Brown (2013), ‘Android seems to be growing into the worm that eats the Apple’s core’.

References:

Brown, D 2013, The Epic Battle Between Apple & Google is All But Over – Who Won?, ReadWrite, weblog post, 17 May, viewed 20 October 2013.

McCracken, H 2013, Who’s Winning? iOS or Android? All the Numbers, All in One Place, Time, weblog post, 16 April, viewed 20 October 2013.

Roth, D 2008, ‘Google’s Open Source Android OS Will Free the Wireless Web’, Wired, 23 June, viewed 18 October 2013.

A Reflection on Space, Place and Media Audiences

For the past nine weeks, I have been creating content on my blog for a university Media and Communication subject, BCM240, which explores the role that space, place and locality play in understanding media audiences.

Learning in this subject played out quite differently from the other subjects I’ve experienced in my time at university. There were no set readings and we weren’t spoon-fed information. Instead, we were given a topic, required to hunt down relevant sources that intrigued us, and then asked to write about our findings and our attitudes towards them in weekly blog posts. Being the perfectionist that I am, the search for sources meant quite a lot of reading, but it was reading I was interested and engaged in, so the time wasn’t really an issue, and I feel I now have a greater understanding of media audiences because of it.

I’m going to be quite sentimental and suggest that what I have valued most about the subject is the rich conversation that arose from the research I conducted with my parents. To gain insights into historical cinema and television experiences and spaces, my mother, father and I sat together at the dining table with a bottle of red wine, long into the night, as they shared their precious nostalgic memories of their first experiences with cinema and television. It gave me an eye-opening insight into their childhoods and reminded me that my parents actually lived their own lives before my brother and I were on the scene, a time I know so little about. The results of these nostalgic moments were the posts ‘Memoirs of Early Television: Fried Rice Fridays and Cellophane Improvisation’ and ‘A “Flea Shed” Cinematic Experience’, possibly two of my favourite posts for this very reason.

My all-time favourite post, however, would have to be the one that generated significant online and offline conversation. It was my very first post for the subject, ‘iLoo? Guilty’, which explored media consumption in the most private of spaces: the bathroom. With 32 views, 5 likes, 10 poll participants and multiple favourites on Twitter, it was undoubtedly my most successful post for the subject so far in terms of readership. I attribute this mainly to the controversial and comical nature of its content, and due to its popularity I’m hoping to explore the topic further through my digital storytelling project over the coming weeks.

My readership statistics could have benefited from regular promotion on Twitter, as my tweets were quite sparse over the course of the subject. I believe that promotion on my personal Facebook would also have led to increased hits, but first I need to overcome my fear of university outsiders reading my blog content – something I’m still working on.

Although I did ‘like’ and leave feedback on the blog posts of other students, I could have commented more frequently and across a greater variety of posts, even on the work of those outside the BCM240 subject. Gardner (2012) suggests that effective and thoughtful commenting not only contributes to the conversation but also leaves a ‘digital footprint’, whereby other readers and writers can find their way to your blog, creating traffic and ultimately increasing readership.

Although producing weekly blog posts has been challenging, it has also been rewarding, and I feel I have created an aggregation of posts discussing media audience motivation, behaviour and experiences of which I am quite proud. Through historical and modern explorations of place, space and locality, I have gained a deeper insight into media audiences and the true power of new media technologies.

References:

Gardner, B 2012, Why You Should Leave Blog Comments on Blogs, Brian Gardner, 21 May, viewed 26 September 2013, http://www.briangardner.com/blog-comments/

Hacktivism and the Crime of Curiosity

Hacktivism involves the same disobedience and protest as activism, simply relocated to the online environment, where skilful hackers do what they do best with the aim of promoting free speech, supporting human rights and exposing corruption. Groups of hacktivists are responsible for denial-of-service attacks, information theft, data breaches, website defacement, typosquatting and other acts of ‘digital sabotage’ (Paganini 2013).

Whether you love, hate, condone or support him, Julian Assange is the perfect embodiment of the term. WikiLeaks was designed to benefit society by exposing top-secret information using his own and his followers’ tenacious hacking skills. In a networked society where information is free and hacker curiosity and confidence run high, hacktivist culture is thriving, and over recent decades we have seen an influx of groups such as Omega, AntiSec and Anonymous causing a stir online.

In the online environment, where do we draw the line between cybercrime, hacktivism and pure cyber nuisance? The point is widely debated. Many argue that channels for free speech already exist and that hacktivism simply causes damage online, while others insist that it is purely a form of protest and should therefore be protected.

If those in power do not secure their information well enough to withstand a hacktivist attack, I say let the lesson be learnt about whatever it is they have to hide. In a networked online world, information is free, and skilful curiosity should not be labelled a crime.

I’ll leave you with an eloquent piece of writing from one of the earliest hackers, The Mentor, written in 1986 shortly after his arrest:

We explore… and you call us criminals.  We seek after knowledge… and you call us criminals.  We exist without skin color, without nationality, without religious bias… and you call us criminals. You build atomic bombs, you wage wars, you murder, cheat, and lie to us and try to make us believe it’s for our own good, yet we’re the criminals.

Yes, I am a criminal.  My crime is that of curiosity.  My crime is that of judging people by what they say and think, not what they look like.

My crime is that of outsmarting you, something that you will never forgive me for. – The Mentor (1986)

References:

Paganini, P 2013, ‘Hacktivism: Means and Motivations … What Else?’, Infosec Institute, viewed 2 October 2013, http://resources.infosecinstitute.com/hacktivism-means-and-motivations-what-else/

The Mentor 1986, ‘The Conscience of a Hacker’, Phrack, vol.1, no.7, viewed 29 September 2013, http://www.phrack.org/issues.html?issue=7&id=3&mode=txt

Mobile Technologies as Unconscious Distractions

It’s clear that mobile telephones are revolutionary in their role as sites of technological convergence. They transport many traditionally physical elements – the clock, the calendar, the camera, even the weather man – into our very own palms. But it’s the more unique applications within these devices that intrigue me, and the places and spaces they transport us to.

I call them our distraction devices. Instagram distracts us from a boring train ride, Snapchat from our seemingly never-ending piles of assessments, and text messages from the people around us, like the guy set up out the front of Subway trying to steal all of our pennies for Surf Life Saving. These devices transport us from previously monotonous public situations to our own private realms, where and when we choose. But the question stands: is this distraction even a choice anymore?

I just returned from a beautiful trip up the coast to a camping ground called Pebbly Beach, just north of Coffs Harbour, NSW. When we arrived, a sign greeted us: ‘No Running Water’. I tried not to think about it, but I knew what that meant – no toilet and no shower for five days. It didn’t faze me too much, though. Something much more shocking was to come.

After the 20-kilometre dirt road and a creek crossing, we arrived at Pebbly. It was absolutely stunning. Much to my boyfriend’s disgust, I reached straight for my phone and snapped a photo for Instagram. Distracted. Then, all of a sudden, I noticed those hated tiny symbols at the top of my handset: no reception and no 3G coverage. No way of making the folks working back home jealous of my adventurous vacation up the coast. I was undeniably upset.

I switched my phone off for the first four days of the trip and became somewhat empowered by the freedom of living in the moment with no distractions. On the fifth day, however, I cracked. I went for a run at 5am along the surrounding beaches until I stumbled across some reception and my phone went into an acoustic frenzy. What followed was an overload of Instagram posts and a strange feeling of relief deep inside me.


Mobile phones are becoming an extended part of our beings, transporting us and ultimately distracting us. Facebook in lecture theatres, phone calls in restaurants, text messaging while driving, Instagram on remote beaches – this distraction is everywhere, and the majority of it is no longer premeditated.

Twitter’s Transformation of Media and Democracy

On the off chance that some readers aren’t entirely up to speed with the social media landscape, let’s begin with a brief rundown of the wonderful world of Twitter. Twitter spreads information. It is a micro-blogging social media platform that allows its users to post ‘tweets’ of 140 characters or less to their audience of ‘followers’ – the individuals who choose to subscribe to them, anyone from friends, family and colleagues to complete strangers.

Since the platform first emerged on the World Wide Web in 2006, it has undergone substantial alterations in functionality, many of them the result of user innovation. In 2007, Twitter users began using the ‘hashtag’ (#) to tag the content of tweets. Other users could then search for that hashtag, which aggregated all the other micro-blogs discussing the same issue or event. Thanks to its surging popularity, the hashtag was incorporated as a fundamental aspect of Twitter’s successful business model and is now one of its iconic features. Users further developed the use of the @ symbol to reply and communicate with other users, and have also found ways around the tiny 140-character limit by including links to outside content (Johnson 2009).
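As a toy illustration of what that hashtag aggregation involves, here is a short Python sketch. The sample tweets are invented for the example; the logic simply pulls hashtags out of tweet text and groups tweets under each tag – essentially what Twitter’s search began doing once users adopted the convention.

```python
import re
from collections import defaultdict

# Invented sample tweets, for illustration only.
tweets = [
    "Huge crowds in the square tonight #jan25",
    "Live stream of the protest here: http://example.com #jan25 #egypt",
    "Studying media audiences for #bcm240 this week",
]

HASHTAG = re.compile(r"#(\w+)")

def hashtags(text):
    """Return the lower-cased hashtags found in a tweet."""
    return [tag.lower() for tag in HASHTAG.findall(text)]

# Group tweets under every tag they mention.
by_tag = defaultdict(list)
for tweet in tweets:
    for tag in hashtags(tweet):
        by_tag[tag].append(tweet)

for tag, tagged in by_tag.items():
    print(f"#{tag}: {len(tagged)} tweet(s)")
```

The convention is powerful precisely because it is this simple: a searchable marker inside ordinary text, requiring no change to the platform itself until Twitter chose to build it in.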

Generally, one 140-character tweet alone is quite insignificant. But as these tweets connect and intertwine, they develop power and begin to add up to something truly substantive. Johnson (2009) likens their collective power to a ‘suspension bridge made of pebbles’. A pebble on its own is useless, but when hundreds are combined, they function with collective strength. Together, users of Twitter are creating public conversation and a collective intelligence that is reshaping traditional media and democracy.

The growth of the Twittersphere has had immense effects on the media industry. Traditional published media content is filtered by gatekeepers who decide what gets published to audiences, and when; this information is therefore subject to bias and personal agendas (Bruns 2009). New media forms, however, have abolished the role of the gatekeeper, as no information filters exist. Twitter has supplied users with a public arena in which they can share real-time news and their otherwise silenced stories. We are only just beginning to see the sheer power of these combined tweets (or pebbles) through movements like the Arab Spring.

Twitter Revolution

Twitter has been used by activists as a powerful citizen journalism tool to bypass government restrictions and give voice to underground civilian communities, assuring people of the Arab world that they are not alone and that others are experiencing similar brutality and injustice at the hands of those in power. The platform has also been used to organise protests of thousands through the use of hashtags, and together these activists have ultimately toppled powerful dictatorships (Kassim 2012).

It is safe to say that Twitter alone does not overthrow governments – courageous people do. It has, however, assisted in user communication and coordination. It is the sheer power that these media tools provide users with that must be noted. Audiences are no longer simply consuming the information that is fed to them from industrial media. Instead, they are producing and broadcasting their own stories and opinions through social media, with unmuted voices and combined efficacy.


References:

Bruns, A 2009, ‘News blogs and citizen journalism: new directions for e-journalism’, in Prasad, K (ed.), e-Journalism: New Media and News Media, BR Publishing, Delhi.

Johnson, S 2009, ‘How Twitter Will Change the Way We Live’, Time Magazine, 5 June, viewed 21 September 2013, http://content.time.com/time/magazine/article/0,9171,1902818,00.html

Kassim, S 2012, Twitter Revolution: How the Arab Spring was Helped by Social Media, PolicyMic, weblog post, 3 July, viewed 21 September 2013, http://www.policymic.com/articles/10642/twitter-revolution-how-the-arab-spring-was-helped-by-social-media

Urban Screens: Wasted Space or a Step in the Right Direction?

The television screen was once a technology that privatised the public sphere, bringing civic engagement into domestic space (McQuire 2006). With contemporary audiences increasingly absorbed in their own private smartphone and laptop screens, these traditional mass media channels have begun to struggle for audience attention and are now developing new ways of reaching them. Cue the proliferation of media screens in public spaces…

We are beginning to see the reverse of television’s initial effect: the private screen is now relocating itself within the public arena. Since the mid-1990s, screens have emerged worldwide on unlikely urban surfaces and can now be found anywhere from taxis, gyms and restaurants to doctors’ surgeries, festivals and shopping malls. They display anything from advertising and promotional material to television and film content. McQuire (2011) suggests that the very content they display is often associated with their demise, and that as the density of displays grows, their impact diminishes.

At the University of Wollongong these urban screens are abundant. They sit balanced on walls and fittings surrounding the main thoroughfares on campus, the library entrance, popular cafes, eateries and common entry and exit points. But the question stands: is anyone truly watching?


To my eyes, these screens have become visual pollution that my subconscious simply filters out. I have passed them on a daily basis for almost two years now and have never stopped to appreciate what they are trying to tell, or sell, me. I can assume that, like most public screens, they host advertising material or content unrelated to me. In my mind, they may as well be switched off. If these screens are to succeed in capturing the attention of distracted and time-scarce students and academics on campus, they need to evolve to take advantage of their vast capabilities instead of simply recreating traditional advertising billboards with the added effect of high-definition movement.

Interactivity is the key. With the rollout of the National Broadband Network, high-speed internet connectivity is making this interactivity increasingly possible (McQuire 2011). Take a look at Coca-Cola’s use of its iconic digital billboard in Kings Cross during the recent ‘Share a Coke’ campaign, which involved passers-by in new and exciting ways. Ray-Ban uses interactive screens so that users can virtually try on different sunglasses, and Toyota’s ‘Vision Wall’ has taken touch-screen technology to new levels. These interactive screens all have barely hidden marketing agendas, but audiences seemingly don’t mind. Instead, they actively contribute to the ‘buzz’ of each campaign by circulating and sharing content on social networking sites. These alternative spaces are also fostering new relationships, as they bring participating audiences together to interact.

So I pose the question to University of Wollongong students and academics: what do you want to see on the screens located around campus? Personally, I would like to see an interactive map; I still get lost at the start of each session trying to find buildings I haven’t seen before, and I can’t even begin to imagine how an overwhelmed one-off visitor would navigate around campus. If we manage to integrate audience interactivity into the screen content, no doubt the University will be moving one step in the right direction.

References:

McQuire, S 2011, ‘Networked Urban Screens and Participatory Public Spaces’, Telecommunications Journal of Australia, vol.61, no.4, pp.64.1-66.10.

McQuire, S 2006, ‘The politics of public space in the media city’, First Monday, vol.4, http://journals.uic.edu/ojs/index.php/fm/article/view/1544/1459