iPad vs e-reader

•October 17, 2010 • Leave a Comment

Since July 13th, the battle in the Netherlands between the long-expected iPad and several e-readers has been underway. But what, actually, are the downsides of the iPad as an e-reader, and of an e-reader compared to the iPad? Let's find out!

Many analysts forecast that the launch of the iPad would be devastating for the e-reader market, although e-reader purists do not agree. One big difference between the iPad and an e-reader is the screen. The iPad uses, according to Apple, the more advanced IPS (In-Plane Switching) screen technology. This technology, which originated from Hitachi's research efforts in 1996, is known for its wide viewing angle. Several years and generations later, the technology became affordable enough for application in the iPad, and it gives the user a viewing angle of 178 degrees. Of course the better viewing angle was obligatory, since iPad applications are designed for both landscape and portrait use. Most e-readers, by contrast, use (still grayscale) electronic-ink technology, which according to its advocates is easier on the eyes for heavy readers, consumes less battery and keeps the device lightweight. Another difference is the mirroring and blurring of the iPad screen, most noticeable when it is used outdoors. Most e-ink screens do not suffer this effect in sunlight, provided they are combined with a non-reflective screen surface. The screen of the Nook, for instance, needs no backlight and uses the ambient light that falls on it: it acts much like a normal piece of paper, which you read by the light of the environment. The only downside of the e-ink screens currently sold is the lack of color; the manufacturer E Ink has promised color e-ink screens for 2011.

There are some doubts about e-ink technology, which is known for its slow refresh rate. Although this is also an advantage compared with LCD screens, which refresh sixty times per second and are therefore heavy on the eyes, the slow refresh rate limits the functionality of e-readers: it makes them unsuitable for video or internet use. And there lies the biggest difference between the e-readers and the iPad: the two devices are simply better suited to different purposes. The iPad's weight, for instance, is significant, which makes it not really suitable for reading. You might object that big, heavy books are read as well, and you would be right, but not for eight hours at a stretch. The size of the iPad is also, for some, a little too big for convenient portability, whereas the Nook has the size of a paperback and is lightweight too. On top of these benefits, the e-reader's battery life is about ten times longer than the iPad's, thanks to the static images the e-ink screen produces. The iPad can be used for ten hours straight as an e-reader, which is a reasonable timespan for an LCD screen of that size in a mobile device; the e-reader is said to have a battery life of one month, which makes it a good companion on a steamboat trip, where the iPad will get you through a jet flight without getting bored. In practice the shorter battery life of the iPad will not be a big problem, given the availability of electrical outlets in most situations.

eBook shops

In e-reader land there have always been strong ties between e-readers and ebooks. This is mainly because the companies try to bind consumers to their own online ebook store. For the iPad this means that the consumer depends on the offering of the iBooks store, which a while ago reached an agreement with the Dutch Centraal Boekhuis to sell many more Dutch titles; as of September this year, the total offering of Dutch book titles has increased to 250,000. Furthermore, the iPad lets you install different applications that give you the opportunity to read ebooks from the Nook and Kindle stores, which have one million and 700,000 ebooks in their catalogs respectively. These catalogs consist mostly of English titles, but for the internationally oriented Dutch reader this won't be a problem. The prices of ebooks in the different online stores are, by the way, on average the same.

Subscription

In case you want to use your iPad on the internet while travelling, you need a subscription to a 3G wireless network. It is possible to use Wi-Fi access instead, but this won't give you full freedom of internet use on the road. In the Netherlands a subscription will cost you between 20 and 30 euros a month, depending on the amount of time you spend online or the amount of data your activities need. The Kindle e-reader is also available to Dutch consumers, with a 3G wireless subscription included in the purchase. The e-ink screen of the Kindle seems to have been upgraded with better contrast and refresh rates, but the difference in browsing experience between the iPad and the Kindle will remain substantial, keeping in mind the still 'slow' refresh compared with the iPad's LCD screen. The Nook also comes with 3G wireless internet access but is still not available in Europe.

Conclusion

The differences between the iPad and the e-readers mentioned are mainly due to the defining feature of e-readers: the e-ink screen. Although these screens will continue to be developed, the functionality of e-ink technology is less than that of an LCD screen. Besides, e-ink screens are relatively expensive to produce compared with LCD screens. Using an iPad as an e-reader is actually not what the device is best at. It is way ahead of the e-reader when it comes to surfing the internet, which according to many is where the iPad really delivers what it promises. Compared with a laptop, which arguably gives you the same browsing experience, the iPad has the advantage of size and weight. Besides that, the many applications Apple has developed for the iPad are a big incentive to use it, and its intuitive operation helps make the device popular among a broad range of users. On the other hand, the price of the iPad (from 499 dollars) is higher than that of, for instance, the Amazon Kindle 3, which is priced at 139 euros (Wi-Fi model).

What we can conclude is that the iPad isn't actually an e-reader at all, and an e-reader isn't suitable for surfing the web; in this case we are comparing apples with oranges. What can be said is that for the die-hard book reader looking for an alternative to the paper book, an e-reader is definitely a device to take into consideration, because reading an ebook from an e-ink screen can be called excellent. But the consumer who will only occasionally read an ebook or a digital newspaper is definitely better off with the iPad, because reading an ebook from its LCD screen is reasonable enough. I believe the ease of use of the iPad is far better than the interface of most e-readers, mostly because of its touchscreen, which is still not available on, for instance, the Kindle 3. I also think part of the iPad's value lies in its 'gadget' appeal, which is absent in most e-readers, whose look and feel is more plastic, less solid. So for the average consumer with enough cash in pocket who is willing to pay monthly for the iPad's internet access, the iPad will deliver more value in the long term.

Research Twitter, Predict the Future

•October 11, 2010 • Leave a Comment

Twitter is seen by many critics as a platform on which a lot of 'irrelevant' information, whether personal or not, is shared. Arguments about Twitter's presupposed irrelevance are countered by those who point to the predictive potential of Web 2.0, and of social media in particular. My motivation for writing this post is that social media, the product of Web 2.0, already is and will increasingly be the tool of choice for disseminating information and for interpersonal communication. [1] It is therefore important to search for the relevance of this supposedly irrelevant chatter.

Social media platforms are being adopted rapidly by a diverse group of people. From politicians to businessmen, and in another context friends and family, people across a broad bandwidth of ages are disseminating cartloads of opinions, feelings, emotions and their current whereabouts.[2] Studies have shown that Twitter's prompt 'What am I doing?' is in practice answered with far more diverse 'personal' information. From this insight Cheong et al. draw the conclusion that Twitter is 'a hybridization between conventional blogging and an online social network'.[3] Because of the openness of the Twitter platform, this rich information is free yet valuable, as is noticeable in the 'Trending Topics' feature, which gives you insight into the top 10 most mentioned terms on Twitter. This service offers a view of the so-called 'zeitgeist' of the Twitter community as it reflects on the daily news and everyday life. This 'zeitgeist' makes Twitter's relevance all the more apparent, but it is still 'just' a mirror of present sentiments in the offline world, aggregated online. Not to downgrade this intelligence, but playing on the rhetorical question 'What am I doing?', maybe I can surprise you and write about how your tweeting is used for predicting real-world outcomes, a.k.a. the future.
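A 'Trending Topics'-style top 10 is, at its core, a frequency count over recent tweets. As a rough sketch (this is not Twitter's actual algorithm, which also weights recency and sudden spikes), with a made-up handful of tweets and stopwords:

```python
from collections import Counter

# A toy sample of tweets; in practice these would come from the Twitter API.
tweets = [
    "watching the election debate tonight",
    "election results are coming in",
    "my cat is asleep again",
    "the election debate was boring",
]

STOPWORDS = {"the", "is", "are", "my", "in", "was", "a", "again", "tonight"}

# Count every non-stopword term across all tweets.
counts = Counter(
    word
    for tweet in tweets
    for word in tweet.lower().split()
    if word not in STOPWORDS
)

# The most mentioned terms: a crude stand-in for 'Trending Topics'.
top_terms = counts.most_common(3)
```

Here 'election' would top the list with three mentions, which is the whole idea: frequently repeated terms surface as the community's current focus of attention.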

Gruhl et al. had already discovered the 'predictive power of online chatter' by researching the number of mentions of book titles in the blogosphere and the correlating number of books sold in the real world on Amazon.com.[4] Asur et al. did comparable research with the chatter on Twitter.com, forecasting box-office revenues for movies. Their outcomes seemed so promising that they claim 'that a simple model built from the rate at which tweets are created about particular topics can outperform market-based predictors.' [5] The methods used by Asur et al. can be applied to more forecasting than theater movies alone: 'Moreover, gathering information on how people converse regarding particular products can be helpful when designing marketing and advertising campaigns'.[6]

The first thing that came to my mind was that box-office revenues are related to the amount of publicity generated for a particular movie. And indeed, Asur et al. first researched all the effort taken to create attention and buzz about the movies on Twitter. The next step in predicting the future is studying sentiment, using text classifiers to distinguish positively oriented tweets from negative ones related to the particular movies. Not surprisingly, the hypothesis is that the more positive the tweets, the more revenue the movie will generate.

Using the Twitter Search API, Asur et al. extracted 2.89 million tweets related to 24 movies over a period of three months. Before a movie's release, media companies generate promotional information in many different forms: trailer videos, news, blogs and photos. [7] This makes it possible to extract tweets related to the movies by following particular URLs that lead to that online content. Retweets, the forwarded tweets, matter as well, because they disseminate a message to the sender's complete friend list. The prediction of first-weekend box-office revenues turned out to be more accurate than the Hollywood Stock Exchange, which is considered the gold standard. Asur assumes that after a movie's release the value of sentiment rises, and that by measuring the ratio of positive tweets to negative ones the box-office revenue can be predicted. This research has shown that accurate forecasts can be made using social media research; in fact, the accuracy of the Twitter research method reached an astonishing 97.3 percent. Movie revenues were chosen as the subject of research for the simple reason that a comparison with other methods was possible; the method can be applied to many other topics, for instance designing marketing and advertising campaigns. The researchers therefore claim: 'At a deeper level, this work shows how social media expresses a collective wisdom which, when properly tapped, can yield an extremely powerful and accurate indicator of future outcomes.' [8]
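The pipeline Asur et al. describe can be caricatured in a few lines: classify each tweet as positive or negative and compute the ratio of positive to negative tweets. The word lists and sample tweets below are invented for illustration; the actual study uses a trained text classifier and a regression fitted on real box-office data.

```python
# Toy sketch of the tweet-sentiment step described above. The word
# lists and sample tweets are invented for illustration; the paper
# uses a trained classifier, not a hand-made lexicon.

POSITIVE = {"great", "awesome", "loved", "amazing", "must-see"}
NEGATIVE = {"boring", "awful", "hated", "terrible", "skip"}

def classify(tweet):
    """Return +1 (positive), -1 (negative) or 0 (neutral) for one tweet."""
    words = set(tweet.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return (score > 0) - (score < 0)

def pn_ratio(tweets):
    """Ratio of positive to negative tweets about a movie."""
    labels = [classify(t) for t in tweets]
    pos = labels.count(1)
    neg = labels.count(-1)
    return pos / neg if neg else float(pos)

tweets = [
    "loved the movie, awesome cast",
    "great effects, must-see",
    "pretty boring to be honest",
    "saw it last night",
]

ratio = pn_ratio(tweets)  # 2 positive vs 1 negative tweet -> 2.0
```

In the paper this kind of ratio, together with the sheer rate of tweets, feeds a regression model predicting revenue; the sketch only illustrates the counting step.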


So what is the reaction of others? Is this exciting? Brian Solis, principal of Futureworks, an award-winning new media agency in Silicon Valley, writes in a reaction to the research described above: 'Twitter's trends is a cultural mirror that reflects the state of attention and intention. And as such, Tweets then offer an MRI that visualizes the minds of consumers and more importantly, serve as a crystal ball that reveals the future of products and services before and soon after they're released.'[9] Further, Solis notes that many businesses by now try to engage in conversation with the consumer instead of 'capturing and analyzing that inherently inspires empathy and ultimately relevance.' [10] Partly due to the research of Asur et al., this may change quickly. And if businesses are about to change their processes, there are changes coming for the consumer as well. A smart business will try to harness as much information as possible from social networks such as Twitter and use it in its methodologies of product design and services. In this way it is the consumer's turn, because the engagement between consumer and business will be from the outside in and from the bottom up, as Brian Solis states.


References:

[1],[2],[3] Cheong, M., Lee, V. 'Integrating Web-based Intelligence Retrieval and Decision-making from the Twitter Trends Knowledge Base'. ACM, New York, USA, 2009.

[4] Gruhl, D., Guha, R., Kumar, R., Novak, J., Tomkins, A. 'The Predictive Power of Online Chatter'. In 11th ACM SIGKDD, Chicago, IL: ACM, 2005, pp. 78-87.

[5],[6],[7],[8] Asur, S., Huberman, B.A. 'Predicting the Future with Social Media'. National Science Foundation, Computing Research Association CIFellows Project, 2010.

[9] Solis, B. 'A Prediction: Twitter to Predict the Future'. http://www.briansolis.com/2010/04/a-prediction-twitter-to-predict-the-future/, consulted on 09-10-2010.

My first Wikipedia entry

•October 4, 2010 • Leave a Comment

Last Friday, the 1st of October, I published my very first Wikipedia entry, 'Howard Menger', on the Dutch Wikipedia. The chosen topic was actually based on a couple of simple Wikipedia facts. First, if you want your entry to have a reasonable lifespan, you should go for a topic that is missing from an already existing root entry. In my case, I found the name 'Howard Menger' on the Dutch Wikipedia entry 'Unidentified flying object', written in red characters, which means no such entry exists in the Dutch Wikipedia yet. Second, you improve your chances as a Wikipedia contributor if you act on this first fact by searching for some information about the topic you want to write about: information in an encyclopedia needs to be referenced, so you need the references first. There was already an English Wikipedia entry on 'Howard Menger', so this was a good starting point for me, not only for the information about Menger but also for the layout of the text. My first step was to translate from English to Dutch, so essentially my first Wikipedia entry was a translation job. I wondered whether what I was doing had any Wikipedia relevance. To find out, I posted the entry relatively early to see whether Wikipedia bots or administrators would disagree with me. After the visit of Erwin85Tbot, the entry seems to be all right. I have the feeling that the more important Wikipedia entries, those that are visited more often, are a priority for the administrators; therefore I believe that the relevance of your posted information influences the lifespan of your entry. Today my entry was visited by another bot, RedBot, which checks the grammar of the text. Also, Philip Man found a typo in my entry and set it right. Thanks, man!

Despite all the warnings that Wikipedia entries are at risk of elimination, I have found it rather easy to keep my entry alive so far.

MIN(D)ING Your Data

•September 27, 2010 • Leave a Comment

Last week, 'surveillance and control' was the theme the Masters of Media students were debating. Theorists such as Michel Foucault, Deleuze, Galloway and Thacker, and Chun passed in review. With the articles of the last three, there was some pessimism going around during the discussions.

In their article 'Protocol, Control and Networks', Galloway and Thacker write about the network discourse as a way to understand political control in society: 'It is a totalizing control apparatus that guides both the technical and political formation of computer networks, biological systems and other media.' Needless to say, most of our activities nowadays depend on and are ruled by the distributed network. The individual is free to disseminate information, do business and spend a fairly big part of his free time surfing the net, all the while leaving virtual traces. These online practices can be experienced as freedom in a positive way, but Galloway and Thacker have apparently chosen the pessimistic point of view, as they state: 'this distributed character in no way implies a freedom from control'. Following Deleuze's notion of the 'dividual', all data on the web is subject to collection, and thus so are we. In the article, the two theorists debunk the myth that the Internet is a celebration of freedom by focusing on its dangers as well. Wendy Chun, on the other hand, mediates between what she calls 'extramedial representations', by which she means the utopian/dystopian notions, and 'research' approaches to the freedom and dangers inherent in protocol and indispensable for the very existence of the Internet. Chun's research focuses on the actual operations, and operational failures, of the Internet seen as a global surveillance machine. Chun isn't downplaying the analysis of Deleuze used by Galloway and Thacker: 'This is not to say that Deleuze's analysis is not correct but rather that it – like so many analyses of technology – unintentionally fulfills the aims of control by imaginatively ascribing to control power that it does not yet have and by erasing its failures.' (p.9)

By analyzing the status quo of the Internet, imagined by many as a 'global' surveillance machine, Chun investigates networked media and states that the Internet can't exist without accepting the vulnerabilities users experience: 'Importantly, without this incessant and seemingly disempowering exchange of information, there would be no user interactions, no Internet. The problem is not with the control protocols that drive the Internet – which themselves assume the network's fallibility – but rather with the ways protocols are simultaneously hidden and amplified.'

As previously mentioned, during the workgroup discussions of these different perspectives I felt the dystopian notions about analyzing online personal data prevailed. For instance, Facebook users are subject to marketers' profiling practices, which many students felt to be an abuse of their online traces and digital identity. Online personal content and conversations are confiscated through voluntary agreement with the terms and conditions of the social media platform. Although many users are aware of this fact, they agree, often without actually studying those terms and conditions. As the discussion went on, I felt the pessimists were overrepresented.

To keep things simple, imagine there are two parties in this debate: the pessimists and the optimists on the consumer-surveillance theme. One party advocates the harvesting and analyzing of online personal information and the other party does not. On one side you could place the consumer, and on the other the owner or marketer using data scraped from social media networks to his advantage. Where does this position the new media researcher analyzing these systems of data retrieval and profiling? According to Richard Rogers, Director of the Digital Methods Initiative, there are concerns about 'how to make use of the copious amounts of data contained in online profiles, especially interests and tastes', which need to be answered by researchers like us, the Masters of Media students who are likely to become the new media researchers. Rogers calls his approach 'post-demographic', which could be seen as 'the study of the data in social networking platforms, and, in particular, how profiling is, or may be, performed. Of particular interest here are the potential outcomes of building tools on top of profiling platforms..' (Rogers 2009) Post-demographics should be understood as a different method from traditional demographic research, where 'race, ethnicity, age, income and educational level – or derivations thereof such as class..' are the subject of analysis. The 'post-demographic' research method is instead interested in 'tastes, interests, favorites, groups, accepted invitations, installed apps and other information that comprises an online profile and its accompanying baggage.' (Rogers 2009: p. 30)
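Rogers's post-demographic move, profiling by shared tastes instead of by age or income, can be illustrated with a toy aggregation over profile data. The profiles below are invented; an actual study would query or scrape a platform such as Hyves or Facebook.

```python
from collections import Counter

# Invented profile data standing in for scraped social-network profiles.
profiles = {
    "alice":  {"interests": ["jazz", "cycling", "film noir"]},
    "bob":    {"interests": ["jazz", "gaming"]},
    "claire": {"interests": ["cycling", "jazz", "gaming"]},
}

def shared_tastes(profiles, min_count=2):
    """Post-demographic aggregation: which interests recur across a
    group of profiles, regardless of age, income or location."""
    counts = Counter(
        interest
        for profile in profiles.values()
        for interest in profile["interests"]
    )
    return {i: n for i, n in counts.items() if n >= min_count}

common = shared_tastes(profiles)
# jazz recurs across all three profiles; cycling and gaming across two
```

The point of the exercise is that nothing demographic (age, income, location) appears anywhere: the group is characterized entirely by what its members like.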

My point is that the most pessimistic approaches to consumer surveillance actually run counter to the very existence of this interesting field of study and to the position of the new media researcher. This is an attitude which, correct me if I am wrong, maybe needs to be reconsidered by new media students who apparently have negative feelings about this research practice. As Rogers stresses, this method of doing research is very productive:

'the platforms continually encourage more activity, inviting commentary on everything posted, and recommending to you more friends (who are friends of friends). With all the ties being made, and all the activity being logged, the opportunities for analysis, especially for social network researchers and profilers, appear to be boundless.' (Rogers 2009)

To conclude: in 2008, a team consisting of UvA new media analysts and technical developers worked on a project called 'Vriendjespolitiek.net'. The project dealt with the way in which the characteristics of individuals, combined with the characteristics of their friends and connections, could tell something about those individuals. The personal information was retrieved from the popular Dutch social networking site Hyves. This project is an example of a post-demographic research method where, 'by providing appropriate visualizations we show both the demographics and the relations of a group of pals, and replicate the existing, arguably anti-participatory democratic, voting recommendation machines. Ultimately the goal is to raise awareness of one's digital public self – one's data body' (Well.com 1995) – 'to create conscience of simple yet powerful profiling techniques, and the tools of the surveillance and control society.' So by doing research and actually providing ways to give the public insight into someone's online appearance, there is a way to both use profiling techniques and let users benefit from the transparency of the tools of surveillance, by which the new media researcher is, I think, positioned somewhere neutral.

References:

– Rogers, R. (2009) 'Post-Demographic Machines'. In Annet Dekker and Annette Wolfsberger (eds.), Walled Garden. Amsterdam: Virtueel Platform, 29-39.

–  Chun, Wendy. (2006) ‘Control and Freedom. Power and Paranoia in the Age of Fiber
Optics’. Cambridge, MA: The MIT Press.

–  Galloway, Alexander & Eugene Thacker. (2004) ‘Protocol, Control, and Networks’.  Grey Room, Inc. and Massachusetts Institute of Technology

– Vriendjespolitiek.net (2010) 'vriendjespolitiek_paper.pdf', retrieved September 25, 2010.

The Chinese Room

•September 24, 2010 • Leave a Comment

On Friday, the 17th of September, in The Hague's Korzo5HOOG theater, I, with a companion, got immersed in the fantasies of dancer and choreographer Kenneth Flak. Inspired by the thought experiment 'The Chinese Room', placed in the context of William Gibson's notion of 'cyberspace', the dance performance was a pleasure to behold. Accompanied by multimedia artist Matsuo Kunihiko and light artist Thomas Dotzler, it was an interesting exploration of Humanity 2.0. In a panoptic setting inspired by Foucault, the audience witnessed a most physically exhausting choreography in an attempt to get the message across. The two dancers, Külli Roosna and the architect of the piece, Kenneth Flak, tried to get on top of each other, using their bodies as tools to form an organic and, I suppose, 'mechanical' whole. In the background Matsuo Kunihiko projected images on three screens, blending abstract moving images with webcam videos of chatting people. The climax of the performance came when Külli Roosna produced a horrifically loud sound with a device attached to her arm; the device measured speed and height, and as a result the sound could best be described as a form of howling. Of course the performance came across as a little 'dated', since the inspiration was based on fantasies stemming from the '90s. The dancers did look as if they acted as human/machines, or cyborgs if you will, and at times seemed to exist only in the virtual space of the projection screens, which reminded me of the movie 'The Lawnmower Man' from '92. And in fact this is where the story, in a way, returns to the title of the performance: the thought experiment of John Searle.

Turing asked himself whether 'artificial intelligence' was possible, or better, whether an interrogator could be fooled by a computer into thinking that the conversation was with a human. In reaction to Alan Mathison Turing's experiment, John Searle responded with another experiment he called 'The Chinese Room'. With this experiment he argued that strong AI could not be possible, since the computer merely 'computes' a program, and a person who followed the same steps, executing the instructions of the program, could come up with the same answers as the computer. The argument is that even if the person does not understand Chinese at all (which in this case the person indeed does not), the answers to the Chinese questions can still be right, since the person uses the program just as the computer does. The role the person plays and the role of the computer in the experiment are the same; and since the person doesn't understand Chinese, there is also no part of the computer that understands Chinese. Searle states that without 'understanding' the computer cannot be regarded as a machine that is 'thinking': it does not have a mind capable of understanding things.
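Searle's point, rule-following without understanding, maps neatly onto a lookup table. The sketch below 'answers' Chinese questions by pure symbol matching; the rule book is invented for illustration, and no step in it involves meaning:

```python
# A minimal 'Chinese Room': the program follows rules (a lookup table)
# to produce correct-looking answers without any understanding of them.
# The rule book below is invented for illustration.
RULE_BOOK = {
    "你好吗?": "我很好。",          # "How are you?" -> "I am fine."
    "你叫什么名字?": "我叫房间。",  # "What is your name?" -> "My name is Room."
}

def chinese_room(question):
    # Pure symbol manipulation: match the input string, return the
    # prescribed output string. Nothing here 'knows' Chinese.
    return RULE_BOOK.get(question, "对不起。")  # fallback: "Sorry."

answer = chinese_room("你好吗?")
```

A fluent Chinese speaker might judge the answers correct, yet neither the function nor a person executing the same table by hand understands a word of them, which is exactly Searle's objection to strong AI.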

Although Searle's argument seems plausible, there are chatbots designed to simulate a conversation. This year the Artemis chatbot won 'The Chatterbox Challenge' and is downloadable here. But after chatting a while with Artemis, named after a goddess from Greek mythology, I concluded it wasn't really an entertaining conversation, as she continuously throws the questions back. But there is hope, as shown in this entertaining YouTube movie concerning ASIMO's primitive learning capabilities.

Although Kenneth Flak's inspiration seemed a little 'dated' to me at first, after seeing ASIMO do his tricks I think this will continue to be a subject of discussion. And where it apparently didn't work in the '90s, despite the efforts of Jaron Lanier, it seems to me we will probably have 'machines' like ASIMO as companions sooner than we expect.

Book Review: Files: Law and Media Technology – Cornelia Vismann

•September 19, 2010 • Leave a Comment

In her book, Vismann writes a genealogy of the media-technological conditions of files and recording devices, with a view to their largest area of application: the law. Files in this genealogy are defined as the medium between authority and administration, while three entities – truth, state and subject – are under constant modification in the formation of Western institutions. Vismann does not write about the content of the law but elaborates how files and administrative technologies in the service of the law shape assumptions about truth, state and subject in Western history. As a result, Vismann's genealogy investigates the continuous formalization and differentiation of the law. With the model of the Rechtsstaat in mind, Vismann gives insight into the abstract law on one side and the agencies that set down and enforce the law on the other. Mutations in the processing of files and media technologies are an essential means of preserving power and authority. The book is written for anyone who is interested in, or uses, file processing and administrative technologies.

Vismann kicks off by exploring oral cultures in search of the origin of law and writing, using the work of ethnologist Claude Lévi-Strauss in combination with the commentaries and theories of Jacques Derrida, founder of deconstruction. In this way she answers the question of how the law functions without record-keeping devices. Vismann analyzes and deconstructs the literary works of Kafka and Melville's Bartleby to bring to light realities of the law that otherwise, according to Vismann, would go unnoticed. Further, Vismann describes the methods of the philologist Budé in his search for the ur-text, analyzing the body of Roman law issued under Justinian I; with this analysis, the formation of the legal system of the Roman empire is set out. Because the rapid proliferation of files brought organizational and, for instance, retrieval problems, media technologies and administrative tactics needed to be developed. Egyptian papyrus, kept on rolls, was a commonly used medium but was not suited for archiving or fast retrieval, since its text could only be searched serially. The first wax 'notebooks', or codices, therefore made it possible to browse through text, accompanied by a new invention, the stylus, which allowed smaller writing.

These new characteristics of media technologies forced a revision of administration and file processing. Vismann's work is solid and rich in reference material. It gives the reader an insight that ranges from the work of the Roman magistrates to concerns about the public's personal files kept and processed by the government. The complete media archaeology is set out from oral cultures to the present, in which Vismann sees the reappearance of files as stylized icons on the computer screen as the closure of an epoch of file processing. The architecture of digital machines can be seen as organized by files and their organization techniques, with the central processing unit controlling all that goes on in the computer. In this sense the history of files contains a prehistory of the computer, which makes Vismann's work a great contribution to further analysis.

Files: Law and Media Technology

Cornelia Vismann

Translated by Geoffrey Winthrop-Young

Stanford University Press, 2008

216 pp, 13 illustrations, English

ISBN-10: 0804751501
ISBN-13: 9780804751506

Remediation pur sang

•September 12, 2010 • Leave a Comment

Last Tuesday I came across the following news bulletin on NU.nl, 'Mogelijk tijdslot voor internet-uitzendingen publieke omroep' ('Possible time lock for public broadcasters' internet broadcasts'), which explains that the outgoing minister of Education, Culture and Science, André Rouvoet, plans to apply a time lock to adult television programs published on Uitzending Gemist (wiki), the online television archive of the Dutch public broadcasters. This law already applies to the cable television network but, according to minister Rouvoet, seems to be of use on new media as well. Needless to say, the law is meant to prevent minors from watching 'non-suitable' content.

The point, of course, is this: by blocking the content, adults who are legally entitled to watch adult content suffer the most. Many minors, I believe, have already found their way to adult content and will have access to the time-locked Uitzending Gemist content via YouTube, as you can read in the news bulletin. It is in fact possible to reach 'time-blocked' content via a proxy server in a different time zone, so questions about the effectiveness of this law are justified. Of course, and happily, this news bulletin did not go unnoticed by politicians such as Martijn van Dam (PvdA) and members of the GroenLinks party, who announced they would ask questions about this case in the Tweede Kamer. These questions show the 'new media' perspective and the 'law-tackling' strategy of the questioner: the emphasis is clearly on the interpretation and the effectiveness of this specific law. This is a logical step, because the law has to be at the center of the discussion for the best chances of adjusting it to suit the medium. That the law is not suited to the medium is also a point the questioner makes clear, and it in fact shows that the people who wrote this law are ignorant of the properties of new media. I think this is a rather worrisome development, since these laws shape our experience of new media.

Theorists Bolter and Grusin claim their 'remediation' theory applies to all new media and is therefore a defining characteristic. <1> But as Lev Manovich notes in his book, the pioneers of the personal computer, Kay and Goldberg, envisioned the personal computer as a machine remediating old media with extra properties. In simulating a paper document on a computer screen, for instance, 'his idea was not simply to imitate paper but rather to create "magical paper"'. <2> The new media characteristic jeopardised by the law in question is the 'on demand' function, which is being hijacked, taking away the magic in our experience of using new media. The fact that pioneers have, since the 1980s, made every effort to get this 'magic' into new media makes it all the more important to defend new media's characteristics against the laws of people who cannot see what the 'magic' of new media is about.

<1> Manovich, Lev. 'Software Takes Command'. Software Studies Initiative, licensed under Creative Commons, 2008. Consulted 7 September 2010: http://lab.softwarestudies.com/2008/11/softbook.html

<2> Manovich, Lev. 'Software Takes Command'. Software Studies Initiative, licensed under Creative Commons, 2008. Consulted 7 September 2010: http://lab.softwarestudies.com/2008/11/softbook.html, page 42.