Join the Media Squatters discussion group

November 4, 2015
Don't Sell Your Friends: Social Media as Social Programming
New Media Lecture Series at Purchase College
Purchase, NY

November 12, 2015
Communication in a Real-time Reality
The Arthur W. Page Society
Chicago, IL

November 14, 2015
Superfluid Economics
Platform Co-op Conference
New York, NY

December 3, 2015
NYU Futures Conference
New York University
New York, NY




CNN: Apple's new iOS follows the leader

Phil Schiller, Apple senior vice president of worldwide marketing, touts the new iPhone 5C last week at Apple headquarters.

(CNN) -- A few million iPhone users woke up to a new operating system this week, the much-heralded iOS 7, the biggest change in Apple's smartphone operating system since the phone was first released in 2007.

It's a different look and feel, for sure. But for those who may fear having to learn some whole new way of using a phone, rest assured: Not much has been fundamentally altered. It's still very much the same operating system you've made a part of your life.

That's really the news here: Apple is taking its users one baby step toward the next mobile platform. And while it may indicate something about where Apple hopes to bring smartphone interfaces in the future, this incremental step may say more about Apple's diminishing ability to make changes in our computing lives by decree.

Back in the Steve Jobs era, Apple could force its dedicated user base to go anywhere Steve told them. The original, VW Bug-inspired iMac was the first major desktop computer that did not offer a floppy drive. Users grumbled and panicked, but they did as Steve told them, and now we live in a world where many readers don't even know what a floppy drive is.

Then Steve took away serial ports, and so on, continually forcing new changes on Apple users for their own good. Consumers even lined up to purchase the iPad on Steve's command; they didn't have any real need for the thing but trusted that Steve knew best. The ongoing success of the tablet seems to suggest that he did.

Apple's new operating system doesn't ask users to make such a leap of faith. Yes, the new iOS is brighter and more colorful and the icons are simpler and less three-dimensional. It looks like a new phone. And the fonts are more streamlined and communicative.

Instead of using lines to separate things, font shapes and sizes are used more effectively to delineate items in lists -- making everything a bit more airy. Most significantly, the "things" depicted in the new OS look less like things. There are no little binders drawn on notebooks, no details on buttons, no leatherette address book and no shadows behind pages.

If anything, Apple is weaning us off the metaphorical desktop, trying to bring us into a virtual universe where we no longer depend on the familiar anchors of the "real" world to relate effectively to our applications.

On a smartphone, an address book is not really an address book, after all. It's an application that lets us search through data about our contacts. A notepad is not a pad with paper; it is a way of saving text. While the real world metaphors initially helped us understand what all these things were for, apps needn't be bound by the limits of real world devices. For the computing world to develop further, it needs to be freed of the obligation to do everything in ways that reflect real world processes.

Perhaps surprisingly, Microsoft already accomplished this last year. With nothing to lose, the company worked with Nokia on smartphones and tablets whose Metro interface no longer has a desktop as we traditionally think of it.

Instead of depicting icons against a background, Metro breaks the screen up into tiles that deliver live feeds of whatever is happening in the apps. Streams of tweets, incoming e-mails or changing weather all flow from their tiles in real time.

Meanwhile, the apps themselves, once opened, have almost no lines or grids in them at all. The incoming e-mail list (there's no in-"box" anymore) is broken up with space, not lines. Everything is accomplished with innovative font selection and sizing.

Microsoft's operating system represents the future of computing. It was also a market failure, at least in the short run. That's because Microsoft doesn't have the power or charisma to demand that users follow it into the unknown. People followed Steve Jobs because they believed in his vision -- even if they didn't know what it was. Microsoft doesn't enjoy that benefit of the doubt.

But today, without Jobs, neither does Apple. That's why -- instead of dragging us kicking and screaming into the future Microsoft has already developed -- Apple is trying to lead us there one easy step at a time.


Present Shock: The Audible Book!


Discover Magazine: A Manifesto for Living in the Now

Rushkoff says the future is bright for those of us willing to live in the present.

By Gemma Tarlach | Tuesday, August 27, 2013

Are you reading this on a smartphone or tablet? Even if you’re not, you’ve probably got a mobile device within reach. Congratulations. You’ve got at least one foot in a brave new world, says author and documentarian Douglas Rushkoff.

His most recent book, Present Shock: When Everything Happens Now, explains our cultural transition to “presentism,” a post-clock society enabled by the flexibility and reach of the Internet and those ubiquitous ways to stay connected to it. In a presentist world, we’ll work less but more efficiently and be free from the need for constant economic expansion.

Rushkoff told DISCOVER Associate Editor Gemma Tarlach the future is bright for those of us willing to live in the present.

Discover: Are some people confusing the idea of “presentism,” of living in the present, with tweeting and texting and constantly updating Facebook?

Rushkoff: The faux now of Twitter updates and things pinging at you — all the pulses from digitality that we try to keep up with because we sense that there’s something going on that we need to tap into — are artifacts, or symptoms of living in this atemporal reality. And it’s not any worse than living in the “time is money” reality that we’re leaving. 

D: What do you have against clocks?

DR: Time has always been used against us on a certain level. The invention of the clock made us accountable to the employer, gave us a standard measure and stopwatch management, and it also led to the requirement of interest-bearing currency to grow over time, the requirement of the expansion of our economy. That’s not really consonant with a sustainable civilization.

D: In an ideal world, how exactly would this new, post-clock era work?

DR: First and foremost it would unshackle us from this very time-based money that we’re using. Working less, making less, producing less. The mandate for efficiency of the industrial age is not to produce things more efficiently, but to produce more things over time. We’ve had to keep looking to increase. 

Now, for example, the more people transact directly over things like Etsy, the worse it is for the macroeconomy. The industrial age was not about craftspeople trading peer to peer. It was about stopping that. You weren’t supposed to be a craftsperson, you were supposed to be an employee. 

Take retirement: You hoard money now in order not to work when you’re older because you’re on your own. I don’t know of any other form of life that gathers up all the food it needs in the first two-thirds of its life in order to do nothing in its last third of life. In a utopian presentist society, instead of working extra hard to put money in the bank, you’d be working to provide value for the people around you. As you got old, those people would naturally want to take care of you.

D: That sounds a bit idealistic. Don’t you think people freed from the constraints of a clock-based economy and society are more likely to go a little Mad Max, especially if they have to buy their clothes on Etsy?

DR: I do believe humans can rise to the occasion. I think human beings are not necessarily ruthless. They can be. Look at those cultures that push old people off cliffs instead of caring for them. That might be the true presentist society. I guess I’ll find out.


CNN: Manning Verdict Won't End Government Transparency in a Digital Age


(CNN) -- Pfc. Bradley Manning, who provided classified government documents to Wikileaks detailing, among other things, America's undisclosed policies on torture, was found guilty of espionage on Tuesday. The verdict comes on the 235th anniversary of the passage of America's first whistle-blower protection law, approved by the Continental Congress after two Navy officers were arrested and harassed for having reported the torture of British prisoners.

How have we gotten to the place where the revelation of torture is no longer laudable whistle-blowing, but now counts as espionage?

The answer is that government has not yet come to terms with the persistence and transparency of the digital age. Information moves so fast and to so many places that controlling it is no longer an option. Every datapoint, whether a perverted tweet by an aspiring mayor or a classified video of Reuters news staffers being gunned down by an Apache helicopter, will somehow find the light of day. It's enough to make any administration tremble, but it's particularly traumatic for one with things to hide.

That's why they tried to throw the book, and then some, at Manning.

Prosecutors cast simple Internet commands known to any halfway literate Internet user (or anyone who used the Internet back in the early '90s) as clandestine codes used only by hackers to steal data. That Osama bin Laden could download these files off the Wikileaks website (along with millions of other people) became justification for classifying the whistle-blowing as espionage, an act of war. And Manning is just one of a record seven Americans charged with violating the Espionage Act in a single administration.

But prosecuting those whose keyboards or USB sticks may have been technically responsible for the revelations is futile. The more networked we become and the more data we collect, the more likely something will eventually find its way out. After all, a security culture based on surveillance and big data cuts both ways.

Moreover, harsh reaction to digital whistle-blowers only increases the greater population's suspicions that more information is being hidden.

In this one leaking incident, Manning exposed allegations of torture, undisclosed civilian death tolls in Afghanistan and Iraq, official orders not to investigate torture by nations holding our prisoners, accusations of the torture of Spanish prisoners at Guantanamo, the "collateral murder" video of Reuters journalists and Iraqi civilians as U.S. soldiers cheered, U.S. State Department support of corporations opposing the Haitian minimum wage, the training of Egyptian torturers by the FBI in Quantico, Virginia, and the U.S.-authorized stealing of the U.N. Secretary General's DNA -- the list goes on.

These are not launch codes for nuclear strikes, operational secrets or even plans for future military missions. Rather, they are documentation of past activity and officially sanctioned military and state policy. These are not our secrets, but our ongoing actions and approaches.

A thinking government -- a virtuous one, if we can still use such a word -- would treat this as a necessary intervention. Things have gone too far. But ours is a government in "present shock": an always-on, always-connected population puts the administration in a state of perpetual emergency interruption. It's not the phone call at 2 a.m. for which a president has to be prepared, but the tweet at 3, the Facebook update at 4, the YouTube video at 5, and on and on.

In such a crisis-to-crisis landscape, there's no time to implement or even articulate a "grand narrative." A real-time, digital world offers no sense of mission or opportunity to tell a story. There's no Cold War to win. No moon shot to work toward. There are just emergent threats, one after the other after the other. Things just exist in the present, one tweet -- or, actually, many tweets -- at a time.

This makes it exceedingly difficult to frame our policies and strategies with language and purpose. It's no longer a matter of walking the talk. Without the talk, there's only the walk. We have no way of judging the ethics and intentions of our government except by what it actually does.

Combine this with the transparency that comes with digital technology and our leaders simply have no choice but to do the right thing. It takes more energy to prevent exposure than simply to behave consistently with the values we want to project.

Just as corporations are learning that they can no longer maintain low prices through overseas slave labor without getting caught, a democratic government can no longer maintain security through torture and coercion without being exposed. Betraying our respect for human dignity only makes us less resolved as a people, and less trusted as a nation.

We are just beginning to learn what makes a free people secure in a digital age. It really is different. The Cold War was an era of paper records, locked vaults and state secrets, for which a cloak-and-dagger mindset may have been appropriate. In a digital environment, our security comes not from our ability to keep our secrets but rather our ability to live our truth.


Rushkoff at Harvard: Present Shock, Dave Weinberger and Game of Thrones

Dave Weinberger led this discussion with me at the Berkman Center at Harvard Law School, one that ended up moving from Present Shock to the digital economy to Game of Thrones and beyond.



Book Business: Katinka Matson, The Brockman Agency
Media Inquiries: media[at]rushkoff[dot]com
Talks: talks[at]rushkoff[dot]com
Personal: rushkoff[at]
All Else: contact[at]rushkoff[dot]com



Follow @rushkoff on Twitter.



