As followers of Newspaper Death Watch and general observers of newsstands are well aware, smaller and notable papers alike have been either shutting down or switching exclusively to online editions. Ad money and circulation numbers are down. Surviving papers are largely bleeding money and living on borrowed time. The news is still getting out, but fewer and fewer Americans are purchasing a physical product to get it.
With Amazon pushing the Kindle and the growth of the cloud, even with its issues, the move is ever more away from physical media of any form. Music is downloaded, customers stream movies from Netflix, books are digital and software is moving online, piracy aside. Discs are becoming a less popular means of distribution in favor of a digital alternative that has damaged, and will continue to destroy, the market for print media. Print media will still have its place, but it will no longer be a general-purpose item. Print newspapers are rapidly becoming an item for the elderly and for niche customers. Once the elderly customers inevitably die off, that leaves almost only the niche.
The consequences are evident to anyone who knows the reality of the difference in profitability between print and online advertising: the latter pays a fraction of the former. This will mean an even greater reduction in overall advertising revenue for the journalism industry, likely leading to the employment of even fewer journalists doing more limited coverage, both in scope and in depth. The journalism industry is tenacious, but it has numerous future challenges to contend with.
For the most pertinent news on an event of the importance of the status of Sony’s PlayStation Network, those interested would not have turned to the New York Times and other major newspapers, which had arguably more important matters to cover. Rather, they could turn to blogs such as Ars Technica and Destructoid to get the latest updates on the situation.
The importance of the story aside, these blogs served as valuable sources of information for those who wanted to know. But does that make these bloggers journalists? Blogs vary in their level of professionalism. Ars Technica and Destructoid have editors who, ideally, vet stories before they get published. But anyone can start a simple WordPress blog, just like this one, and have an equal platform to publish their views. Should those at Daily Kos and Free Republic and wherever else be said to be doing journalism?
There is certainly an element, amongst both older and younger journalists, that would refuse to bestow that title upon bloggers. The notion of anyone putting out anything and living by their own standard of ethics and conduct goes against the grain of the most established names in journalism. Of course, creative fields rejecting something new as not being “legitimate” is nothing new. “Is rap really music?” “Is this work really art?” “Should this really count as literature?” And so on. With time, most large-scale additions simply come to be accepted.
And it is not the case that mainstream journalistic outfits do not have bloggers. NYTimes.com has bloggers, for example, and most journalists blog and tweet on the side to promote themselves. The Huffington Post even made it big on a platform consisting almost entirely of bloggers. If bloggers are not journalists, it is because our definition of “journalism” has not caught up with the modern era; news and blogs have become nearly inseparable at this stage.
Ars Technica is mulling the prospect of putting more user-contributed stories on its front page. Currently, the option to have only Ars writers publish stories on the front page is leading by a slim margin, but it is only a plurality rather than a majority. News organizations clearly see their users as exploitable sources of free content; the question is how far to go with it. CNN’s most ambitious move towards user-generated content is iReport, whose submissions occasionally get used as part of CNN’s main coverage. Other outlets, such as the Huffington Post, rely almost entirely on user-generated content and can be, as in the Huffington Post’s case, quite successful with it.
The debate is between keeping content production strictly in the hands of professionals, thus theoretically ensuring quality and accuracy, and opening the floodgates to a wealth of free content at the risk of lower quality and potentially false information. The former maintains integrity, and still allows for profits; there is certainly value in being a reliable source. The latter approach, to varying degrees, discards integrity in the name of low effort and high profits. Ultimately, unless one stops being profitable in the future, it is not likely that either will completely win out over the other. Sites will continue to incorporate user-generated content at iReport levels or at Huffington Post levels, or eschew it entirely. But user-generated content is not about to go away, nor is it going to pervade all news sources. The New York Times, for instance, is not likely to run user-generated articles anytime soon. Instead, there will be a dual existence, and news consumers will be the better for the wealth of options.
According to recent data, 56 percent of Americans are subject to broadband data caps, prompting consumer groups to ask the FCC to investigate the matter. ISPs argue that it is only fair that those who use the most bandwidth pay for the higher-than-average costs they generate. That is an incredibly flimsy argument. Bandwidth is cheap, with gigabytes available for mere cents at regular bulk rates, and ISPs tend to pay even less. The notion that they somehow “need” this extra income on top of their already heavily marked-up monthly rates is both asinine and insulting to consumers’ intelligence.
So then why data caps? In part, because no corporation would ever say no to more money. But the larger reason is that many of the same companies that operate ISPs also own enterprises threatened by the growth of the net. The Internet threatens journalism, movies, television, music and almost every conceivable form of content. But if ISPs can make it prohibitively expensive to use Hulu, Netflix, Pandora and other competitors, then they do not have to compete on a level playing field at all. Superior alternatives to their existing products exist, but because these companies often have majority control of the territories in which they operate, they can simply prevent entire areas from taking full and equal advantage of those competitors. Data caps are not about making users pay their fair share; they are about anticompetitive practices. These companies are prepared to hold onto their profits even if they have to hold back America to do so.
Past efforts by the US government to take down websites by seizing domains have been, in the most charitable of terms, laughable. Affected sites simply obtained new domains from registrars outside the US and were back up quickly, often with merely a change in their top-level domain and the same name as before. So, in the United States’ latest bid to police and censor the Internet, some senators have proposed a bill that would allow them to blacklist sites, seizing the domain and legally requiring search engines to stop indexing them.
What a precedent that would set. The current guise is merely combating piracy, something supposedly in the best interests of corporations. But once it is okay to do that for one category of site, the agreement is that it can be okay at all. From there, the possibility that it could be used to suppress groups the government, or its corporate backers, does not care for becomes fully realistic. They would no longer have to justify the action itself, merely the reasoning behind it. Terrorism, the public good and national interests are terms vague enough to allow for quite a bit of room.
Any step towards China’s “great firewall” is a threat to the freedom of information. The protection of free speech necessitates the protection of unpopular speech. The moment one entity is given the authority to declare what is and what is not acceptable speech is the same moment in which speech ceases to be free. No entity truly has the grounds to make such a declaration, nor should one be granted such a platform. It is easy to dismiss this bill and wave it off as merely combating piracy. But if you ask the Trojans, they will tell you that sometimes a horse is not merely a horse.
Newt Gingrich recently announced he is running for the Republican presidential nomination in 2012. What are his odds? That is hard to say, as elections can be unpredictable. Obama was widely written off in favor of the unstoppable Clinton, and history knows how that turned out. So where did Gingrich go to make this sort of announcement? Some local paper would surely be too small, so perhaps he went to a national paper. Maybe the New York Times, for example? No, print media is too quaint. Surely he made his announcement first on some cable news channel, say Fox News. He is on that channel often enough that it would make sense. But he did not do it there, or on any other news channel.
No. Gingrich tweeted his announcement with a link to a video. A potential presidency launched in under 140 characters. While, yes, this might come across to some as an old man simply telling staffers to “make sure we use that Internet thing” after it worked out so well for Obama, it is an acknowledgement that cannot be ignored. A potentially major figure decided that the place to break his news, in a way that would reach people, was Twitter. And choosing Twitter naturally meant passing over every more traditional method. Gingrich, like anyone serious about making a presidential bid, is likely surrounded by very intelligent staffers who would not have executed such a PR move on a whim. The judgment is in, and Twitter is rapidly becoming the stronger horse to bet on.
For anyone who used the PlayStation Network on their PlayStation 3, the saga of the PSN’s continuing downtime has been difficult to miss. Direct updates from Sony have been scarce and cryptic, leaving many in the dark as to the status of the PSN. One of the major game news sources, Game Informer, recently published an article claiming that the network was back up for developers. Their source was a post by a user on a popular video game forum, whom they never contacted before quoting.
Now, it would be easy to look at this and feel smug about the lackadaisical nature of video game journalism as opposed to real news. That is, until one, say, turns on CNN and sees it using unverified tweets and blog posts as sources. CNN and others are all too pleased to rush out merely plausible information. If it turns out right, they were ahead of the curve. If it is wrong, then whoops, sweep it under the rug. You can’t blame the outlet for being wrong, since we all knew the risks of trusting speculative information, right?
Getting scoops is a classic part of journalism, but perhaps it is reasonable for consumers to demand more of “hard” news than of entertainment coverage. The move towards publishing plausible stories now and waiting for the facts to come later is a disservice to news consumers. The truth deserves better treatment than the latest Madden or Call of Duty.