Forecast Calls for Cloudy Skies

Once we called upon Zeus and Horus. Then we created machines to serve and rule us. Now we cast our eyes anew to the heavens to conjure up what we want.

What fresh magic is this? It’s cloud computing, shifting software from your office, desktop, lap, and pocket to remote data centers that you access through the Internet. Setting you free in one sense, yet also making it more difficult to escape the reach and watch of technology.

Discs are as dead as the Victrola: increasingly you’ll go to the cloud to download an app for a particular purpose, stream a movie, or find extra server space.

It may sound a little nebulous, but it’s clearly getting more overcast all the time. The worldwide market for cloud services is likely to grow to $148.8 billion in 2014 from $58.6 billion in 2009, according to Gartner Research. “The last time companies saw this big a shift in computing was when PCs entered the workplace over 20 years ago,” says Ben Pring, a senior research analyst at Gartner. In the next five years, companies will spend $112 billion cumulatively on software delivered over the Internet (a.k.a. “software as a service”) and comparable services, according to Gartner.

By November 2009, about 100,000 companies used cloud applications, according to Bruce Richardson, former chief research officer at AMR Research. In a May 2008 report, Merrill Lynch estimated that 12 percent of the worldwide software market would go to the cloud by 2013, worth $95 billion. [Source: BusinessWeek; the article offers good examples of companies saving money with the cloud.]

Cloud computing isn’t new — Gmail, Flickr, Picasa, Dropbox, and streaming media are all cloud-based services. But storms are brewing as big players like Google, Apple, Facebook, Amazon, and Salesforce compete for your business. Security is an issue — personal data stored in the cloud is under constant hacker assault, although security experts say having remote tech experts guarding their servers (and your information) is safer than doing the job yourself. Data ownership could also be a problem if the owner of the server you’re using changes the rules and decides it can sell your uploaded photos, for instance.

Whatever happens, look for your PC and its operating system to be gone with the wind before long. Gone will be the hassles that long marked our days: installing software, protecting it from viruses, remembering where you stored information, losing everything when you forgot to back it up. You’ll continue to need access to electrical power (and long battery life), but much of what you thought you knew about computing is changing.

The FCC’s Aborted View of Net Neutrality

In one of those Solomonic decisions (notable less for its sagacity than for a result that leaves no one thrilled at the prospect of half a baby), the FCC passed a “net neutrality” ruling that guarantees consumers the right to view content (a check on the power of cable and phone companies and other Internet gatekeepers) but allows service providers to charge more for faster priority speeds, especially on mobile, where network congestion is a genuine strain. So, neither this nor that. Few on the right or the left are happy. But when nobody really gets what he wanted, that passes for bureaucratic wisdom. Expect lawsuits.

The Business of Writing

Ah, the allure of freelance writing. The creative freedom. The flexible work hours. The intellectual stimulation. The grinding poverty.

Consider freelance business journalists. While no one expects them to earn as much as the people they cover, you might think that, given their field, they would have made a financially sound, strategically minded career decision. Yet they typically earn about $25,000 a year (and no benefits or pension, of course). That’s according to a survey by the Society of American Business Editors and Writers (SABEW), which also found that three-quarters of respondents made considerably more when they held salaried jobs.

There are fewer of those full-time jobs in journalism, of course, with the outsourcing of the writing trades, and the technological extinction of the newspaper and magazine business. Their replacements, online content mills, do need copy of course … they’re just not willing to pay much for it, if anything. They’re inclined to interpret that “free” part of freelancing literally.

So do you really want to be a journalist today?

OK, not all freelancers are suffering. Specialized music writers can make $70,000 a year, according to research by Berklee College of Music, so biz writers might want to follow that Pied Piper.

And there is further hope, if only by way of analogy. Smartphones with their high-resolution cameras have pretty much obviated the need for traditional point-and-shoots. But sales of more powerful cameras like SLRs have increased nearly 29 percent since 2009, according to research firm NPD. Independent writers might think of themselves as SLRs and market themselves accordingly, offering something that can’t be duplicated by some mug in Bangalore or Kiev cranking out keyword-laden ad bait at $5 a day.

Or look at the ongoing popularity of wristwatches. People surely don’t need them to tell the time (their smartphones do that too, and usually more accurately). They’ve gone from a necessity to an anachronism. But against the odds, against all reason, they go on and on. Maybe quality journalism will go that route.

Regardless, those freelance business writers probably don’t care. Two-thirds of respondents to that same SABEW survey said they’d never go back to a full-time job. You gotta do what you love. Food, shelter, and health insurance can be overrated.

Of Laughter and Never Forgetting

Time magazine is about to name its “Person of the Year,” the person who has made the greatest impression on the previous 12 months. From A to Z, Julian Assange to Mark Zuckerberg nicely bookend the short list and frame one of the great struggles of our epoch: privacy vs. transparency.

The WikiLeaker from Down Under is likely to get the nod for publicizing information from U.S. classified documents. To his way of thinking, candid assessments written for a limited group of decision makers must be exposed as perfidious. Thus, WikiLeaks has informed the world that State Department functionaries think Hamid Karzai is a crook and Italy’s Berlusconi is “feckless, vain, and ineffective.” Uh, tell us something we don’t know.

At the other end of the alphabet, Zuckerberg schemed to tell Facebook’s corporate partners many things they didn’t know about users of his site. He seemed baffled that anyone would want to hold back their personal profiles from the world at large. What are these people afraid of?

For one thing, people don’t want to be commodities to be bought and sold (although that battle is probably already lost). More than that, people instinctively want to be able to control their reputations, which they can’t when information about them is (a) false or (b) once true but no longer, or taken out of context.

People cling to the idea that they can have separate lives: one for home, one for work, one for friends, and so forth. They also want to be able to reinvent themselves at will – which requires moving on from the past, forgetting, amongst other things, indiscretions that seemed amusing at the time. Yet how can we drop this baggage when the Internet shackles us to every comment or image associated with us?

“A humane society values privacy because it allows people to cultivate different aspects of their personalities in different contexts,” writes Jeffrey Rosen in his outstanding article in the New York Times Magazine in July. “At the moment, the enforced merging of identities that used to be separate is leaving many casualties in its wake.”

It’s not a theoretical issue. Three-quarters of U.S. companies conduct online research on job candidates, according to Microsoft, and seven of ten recruiters report that they have rejected candidates because of discovered photos, discussion-board conversations, or membership in controversial groups.

There’s plenty of grist for the investigator’s mill. Facebook has nearly 500 million members, 22 percent of all Internet users, who spend more than 500 billion minutes a month on the site. Its users share more than 25 billion pieces of content each month (including news stories, blog posts and photos), and the average user creates 70 pieces of content a month. There are more than 100 million registered Twitter users, and the Library of Congress recently announced that it will permanently house the entire archive of public Twitter posts since 2006.

Now advancing facial recognition technology promises (or threatens) to locate photos of people you’re looking for on the web, even if they’re not identified (“tagged”) in the photo. Social-network aggregator search engines will be combining data from various sources to rank people’s public and private reputations. Then there’s the new website Unvarnished, where people can write anonymous reviews about anyone. People are already rated on their creditworthiness. Soon they may be judged and ranked on their reputations as parents, dates, employees, neighbors.

By erasing external memories, “our society accepts that human beings evolve over time, that we have the capacity to learn from past experiences and adjust our behavior,” writes Viktor Mayer-Schönberger in his recent book, “Delete: The Virtue of Forgetting in the Digital Age.” The limits of human memory ensure that people’s sins are eventually forgotten. “Without some form of forgetting,” he says, “forgiving becomes a difficult undertaking.”

Here’s the irony: the Internet was until recently seen as the great liberator. Hillary Clinton, now on the warpath against WikiLeaks, praised Google for empowering Chinese citizens with information about their government. Remember that New Yorker cartoon from the early 1990s: “On the Internet, nobody knows you’re a dog.” Now the leash is back. We know who and where and what you are, Rover. And we’re never going to let you forget it.

The remedies range from legal maneuvers of dubious value (such as lawsuits to force removal of slanderous information or “Twittergation”) to technological innovations – such as built-in expiration dates for data, controlled by the user. Or just being prudent to the point of paranoia.

Supposedly cavalier about over-sharing, the young are catching up to their elders in matters of privacy. A UC Berkeley study this year found that 88 percent of people between 18 and 22 believe websites should be legally required to delete all stored information about individuals.

Facebook could implement expiration dates. If it wanted to. It doesn’t, apparently.

Bad information, like bad news, has a greater impact, as any behavioral psychologist or journalist or PR rep will tell you. So a new industry has arisen to bury the bad news that can’t actually be eliminated. Companies like ReputationDefender flood the Web with positive or neutral information about their customers to rig Google search rankings, pushing the negative links to the bottom.

Whether Time chooses Julian, Mark (or even Sarah) as its emblem of 2010, the bigger story is that technology is rapidly moving us through numbered versions of the world. We’ve left the user-generated content world of Web 2.0, and we’re being shoved into 3.0. Welcome to it.

Inspiration Is Bunk

What does it take to do great work? Whether we’re talking about writing a novel or building a successful company, the answer is the same: Focus and persistence. Being in the right place at the right time (a.k.a., luck) helps, but inspiration has little to do with it. It’s mundane plodding that wins the day.

“I think inspiration is nonsense, actually,” famed author Salman Rushdie told interviewer Max Miller.

Writing, he says, requires deep attention – definitely not inspiration. Concentrate on your characters, he advises, live in their world and tell their story. Stick with it until you finish.

“Every so often, I mean like one day in 20 or something, you will have a day when the work seems to just flow out of you and you feel lucky,” Rushdie says. “I wish there were more of those days, but most of the time it’s a lot slower and more exploratory and it’s more a process of discovering what you have to do than just simply have it arrive like a flame over your head.”

You can develop your powers of concentration; it is a skill that strengthens over time (unlike, say, energy level). That’s good news not just for artists, but for careerists and entrepreneurs.

Fail, fail, fail … until you succeed. That’s the story of art and technology, of social progress and individual success – in every walk of life. Keep your focus. Concentrate intensely. Be persistent and never give up.

Simply putting one foot in front of the other is how many have walked their way to fortune. “A surprisingly large number of people have made fortunes because … they just have unbelievable focus on accomplishing what they set out to do,” says Peter Bernstein, co-editor of All the Money in the World (Knopf, 2007), a book about the richest people in America.

As you slowly propel yourself forward with your art, business, or life, what’s the most important thing to keep in mind? “No. 1: Don’t look back,” Harold Hamm, the 13th child of sharecroppers who made himself an oil billionaire, told Bloomberg Businessweek. “You can never get good direction from looking backwards. Just know that you’re going to make mistakes. Learn to survive those errors and hope they’re not so critical that you can’t survive them. Learn and go on.”

The key is to move forward, forward, forward. “Persistence is everything,” Hamm says. “Very few people have the persistence that they need to achieve the great things. I can’t say hardly enough about that. It’s so important to have persistence to see something through.”

Wouldn’t Mr. Rushdie agree?