
What Ritz-Carlton Can Teach Us

. Fred Leise

Two posts in a row on management!

As part of my management training "readings," I've listened to a presentation given by Horst Schulze, former CEO of Ritz-Carlton. He's got some interesting things to say about what organizations need to do to improve how they work.

"Elimination of defects means becoming efficient. Cutting costs is not efficiency."

"If you hide a mistake, you can't learn from it...A mistake that happens more than once is a procedural defect."

"To hire people only to fulfill a function is immoral."

He also spends a lot of time on hiring the right people and the importance of orienting them to core company values, training them, then reminding them daily of those core values.


ROWE

. Iga

Today's post is about a subject that is highly controversial in the corporate world, but one I am personally very interested in, not in a "yay, let's try it immediately" way, but because of the hard results it can potentially bring.

The topic is ROWE, or the results-only work environment.

Nearly all aspects of our lives, from health care to communication, have been affected by changing technology. When it comes to our working environment, however, we are stuck in the '50s, when the manufacturing industry shaped the 9-to-5 work schedule. This model has prevailed over the years unchallenged, even though our workloads have expanded beyond 40 hours per week and, thanks to technological advances, we can now carry our work home (laptops, BlackBerrys), which we readily do. Yet many researchers in the field have noticed that having a body in a chair for eight hours a day doesn't equate to productivity or results. Being at work doesn't always mean people are working.

That was the genesis of one of the most radical innovations in the workplace: the results-only work environment (ROWE). Nationwide, 3 percent of businesses now say they have a ROWE, including Best Buy, Gap Inc., Verizon, and IBM, to name a few. Evidence shows that teams who have adopted ROWE see productivity rise by 41 percent on average.

A Results-Only Work Environment is a management strategy where employees are evaluated on performance, not presence. In a ROWE, people focus on results and only results, increasing the organization's performance while creating the right climate for people to manage all the demands in their lives... including work.

Because results in a ROWE need to be clearly defined (you cannot evaluate a person based on how many hours a day they spend in the office), managers have to clearly state tasks and deadlines to their teams. Where it was tried, teamwork, morale, and engagement soared, and fewer workers felt overworked, stressed out, or guilty. People were where they needed to be, when they needed to be; they didn't need schedules.

The most controversial aspect, however, was that in a ROWE meetings become optional. For many organizations that became the tipping point to back out of the program. But those who decided to give it a try found that, regardless of whether someone opted out of a meeting, all staff remained responsible for what happened in it. Much attention was given to the "worthiness" of such gatherings, and people quickly realized how many hours had previously been wasted in unnecessary meetings. Jason Fried of 37signals argues in his recent book "Rework" that meetings are a waste of time and expensive: if you have five people in a meeting for one hour, you are losing five hours, not to mention the interruption. And how many managers and directors spend entire days in meetings? How productive are they? How much are they not doing by sitting in those meetings? How many times are they forced to stay late to "catch up"?

ROWE for Employees

ROWE recognizes that life is an individual experience and that no two lives are identical, and it leverages this to get better performance from each individual. ROWE is not flextime, telecommuting, or job-sharing, and it is not about allowing your people to work from home a couple of days per week. ROWE is about cutting from your day whatever doesn't drive results, like being stuck in traffic for three hours every day. You make the decisions about what you do and where you do it, every minute of every day.

ROWE benefits:

* You control the clock and results are your responsibility

* Healthier lifestyle: not overworked, less stress

* Autonomy & accountability

* Environmentally friendly, save on the commute and work from home!

ROWE for Business

Successfully adopting a Results-Only Work Environment will position your company to attract and retain talent that will show up energized, disciplined, flexible and focused, ready to deliver all results necessary to drive the business. A ROWE workforce is more efficient, productive and loyal to the organization while also feeling satisfied, fulfilled, and in control of their personal and professional lives. Management can spend less time monitoring and focus their energy on the business and team building (yes!).

ROWE business results:

* Increased productivity & efficiency

* Talent retention & attraction

* Optimization of space

* Elimination of wasteful processes

Obviously, not all types of business can benefit from and adapt to ROWE: schools, hospitals, airlines, and manufacturing are better off as they are. But in most corporate and other white-collar professions, where creative thinking and problem solving are the key performance objectives, ROWE, with the right attitudes from management, has the potential to make them thrive. Best Buy Co. implemented ROWE in 2006 and reported an average 35% increase in productivity within the teams affected. For a company fighting for survival in an economic downturn, as Best Buy's direct competitor Circuit City once was, that much higher productivity and lower operational costs are a huge advantage.

Sources: http://www.hrmreport.com/ http://gorowe.com/know-rowe/what-is-rowe/ http://www.npr.org/templates/story/story.php?storyId=124705801


Connected Organization

. Iga

This short video, from Kevin Wheeler, is about how to structure a company to achieve success. According to him, the key is to transform an organization from a siloed, top-down structure to a networked organization where communication thrives. A structure like that can deliver a product that offers Complexity, Interdependence, and Innovation. I think I agree with him.

When getting a grilled cheese is impossible, we all lose

. Pete

Today is Crepes Friday.

Most days, I work in downtown Chicago. SHC has offices in a couple buildings there, but no cafeteria facilities. This isn't a problem because there's a multitude of places to eat within an iPad's throw of the office. I could probably find a place that made a great grilled cheese, but in a nerdy way I reserve that particular pleasure for that once-a-week trip out to our suburban campus.

When I roll into Hoffman Estates, lunch is always whatever soup catches my eye, some kind of not-good-for-me Starbucks cold drink, and a grilled cheese. White bread, American cheese, a few strips of bacon, and a little love. I try not to actively abuse my body with the things I eat, but I am powerfully drawn in by the cheese/bacon combo. It was with such lust in my heart that I started out for lunch today. We have a large campus here in "The Hoff", so you plan your trips.

The food here is very good. Lots of choices, happy people, and several locations sprinkled throughout the campus. Fresh from several meetings and fueled with a powerful appetite, I started the sojourn to what we call The Small Cafe, home of my beloved grilled cheese. The Small Cafe is miles closer to the UX department than The Big Cafe is.

Imagine my shock when I finally arrived and learned that today was “Crepes Friday!” at The Small Cafe.

I'm an adult, and remained calm. I'm also flexible, and because of my job I have to roll with things and make compromises in the face of adversity. Intrepid, I approached the counter. It was late, so most of the lunch crowd was here and gone, all creped out, I suppose. "Can I get a grilled cheese? With bacon?" I asked, all smiles.

"Today is Crepes Friday! Only Crepes today." The café employee replied.

My brow knit. Maybe even furled.

"I see the grill right there. Can I get a grilled cheese?" In a show of instant disloyalty to my convictions, I had ditched the idea of bacon, eager to get at least a basic grilled cheese. The friendly employee looked a little taken aback, told me to wait just a moment, and left the line, turning around the corner. I also stepped to the side and in moments the very friendly Cafe manager was with me.

"Only Crepes today," the manager said, beaming.

I looked over her shoulder at the grill and fully-stocked bread rack. "Can I get a grilled cheese? I'm looking right at the grill and the bread." I said this in my best helpful tone, not being a jerk.

"I'm really sorry. Today is Crepes Friday. Only crepes today. You can go the Big Cafe and get a grilled cheese, if you'd like."

In fact, I could. But now I didn't want one. I dis-liked, in fact.

I was a silly, stupid customer. I knew what I wanted, could plainly see that they had what I wanted, and desperately wanted to buy from them. To keep a loyal customer, all they had to do was throw a couple pieces of Kraft Select between two slices of bread on a grill, all within arm's reach. It was tough for them to take my money. It was against the rules. It was Crepes Friday.

No soup for you, Pete Simon. Or grilled cheese.

The thing is, I wonder how often we do this to our customers. I wonder how often we wear blinders, and miss an opportunity to let people give us their money. I wonder how often a devotion to some convention or easy path causes us to lose a customer who was in our store or on our site, money in hand.

I don't have any control over the menu at The Small Cafe. I do have some influence over the experience at catalog.sears.com, and I can only hope that I do a good job of making the design open, approachable, and adoptable. If someone comes to me willing to engage, I want to do everything I can to take care of them and nurture their interest.

Or their love of grilled cheese. With bacon.


Social Dev Camp Chicago

. Dennis Schleicher



Pete Simon talking at Social Dev Camp Chicago on "'Why do I even care?' Designing For the Skeptical User & For Your Community Growth"

Connectile Dysfunction

. Dennis Schleicher

Connectile Dysfunction by Mark Baskinger

A great read in UX Magazine. Almost a must-read if you are looking into how interaction design can feed into industrial design.

Key concepts
  • Design for Impact and Design for Experience
  • Refrigerator: design of an organizational system for cold storage
  • Boomer and Elderly similarities and differences
  • Co-design
  • Steering Wheels as insight for stove knobs (unified product forms and interactions)
  • Strike Zones
Big Themes
  • Integrating interaction into form
  • Situated interaction in environmental context
  • Express through physical/visual form
  • Narrating the interaction

You Are Not a Gadget by Jaron Lanier: Thoughts and Reflections

. wandereye

It didn't take me very long to read this book as it was like listening to someone articulate many of the issues and concerns floating through my subconscious for the last 15 years. Jaron is referred to as the "Godfather of Virtual Reality" and a very loud voice for what he considers a true "fight for the human spirit" in an age of massive technological innovation and disruption.
Jaron will hate my paraphrasing, the slicing and dicing of extractions from his book, citing his treatises about fragmented knowledge and the promotion of shallow understanding when doing so. Though I agree with him in many respects, where I part ways is when I think about the perils of generalization. In other words, if someone (like myself) reads the entire text as it was meant to be consumed (linearly, sequentially) and then extracts the points of interest, I would not consider this act detrimental to the intent of the author, nor to the benefit of the reader in terms of knowledge transfer. If my way of digesting this turgid and massive text about highly abstract social-technological issues is to highlight, revisit, and extract, which aids in memory and internalization, I fail to see how every case of chunked extraction promotes ADD. Where it may have a detrimental effect is when you, the reader of this blog post, bypass reading his book as linear text (Jenny speaks of "codex"), take my interpretations at face value, and never read the source material. As Benjamin says in "The Work of Art in the Age of Mechanical Reproduction," the information is diluted the further it travels from its origins; the aura is somewhat lost.

Excerpts from the book will be blockquoted with my comments following:
The words in this book are written for people, not computers... You have to be somebody before you share yourself. ix

Kurzweil would dispute the above. By 2046, he says, we'll be one with computers and technology. Therefore, computers will be "somebody" by then, if not in limited ways now. Turing tests are another counter to this statement. How would the book know it was being read by a human vs. a computer? Books are one-way communication nodes.
Speech is the mirror of the soul; as a man speaks, so is he. — Publilius Syrus

Articulation is tricky. To be able to verbalize is a skill learned over time through several influences including culture, evolution, etc. Non-verbal communication seems to be the major breaking point in our current efforts to understand customers.  I would rephrase this to say "You are what you do, not what you say you do."

[web 2.0] promotes radical freedom on the surface of the web, but that freedom, ironically, is more for machines than people. p.3

This is Jaron's introduction to "lock-in," where computers define the design constraints as opposed to responding to them. Web 2.0, in favor of some "back end" capabilities as well as enhancements to hardware and channels to move information, has helped design and user experience take a large step back in favor of functionality over form. An entire design vernacular has been introduced and followed with flock-like mentality within the industry. Web 2.0 seems to have driven a wedge into an already widening gap between designer and developer by the mere fact that the markup and languages and systems are evolving quickly enough to warrant specialization.

It is impossible to work with information technology without also engaging in social engineering. p.4

Communication influences. You can't erase it or take it back. Virtual or non-virtual, time keeps on ticking away. When any human "uses" something, s/he/it is being manipulated and exploited, guided through a taxonomy or construct. The internet has never not been social. It was created for human beings to share information via a syntax (markup) over a network scheme. Sharon Poggenpohl was right on ten years ago when she told me "designers of the future will not be stylists but will be designing frameworks and systems that leverage patterns." I see the transition from Web 2.0 introducing an enduring concept that has become somewhat of a mantra of late: "content is king." The medium will not be the message (right now it is).

Different media designs stimulate different potentials in human nature. We shouldn't seek to make the pack mentality as efficient as possible. We should instead seek to inspire the phenomenon of individual intelligence. p.5

Individual intelligence comes from empathic connection and engagement with objects and others. I too am concerned with the "pack mentality" found throughout the world of Web 2.0. Sure, conformity makes things easier in terms of management and adoption. But I don't think we're far enough into the evolution of our systems to warrant the abandonment of trying new things. Still, tribe and relationships are human nature in the span of time with or without computers (again, the distinction between on and offline is blurring). 
Being a person is not a pat formula, but a quest, a mystery, a leap of faith. p.5
Yes, being a person is trial and error and learning and growing. To what extent there is a will to be an individual... that's another story altogether.
We make up extensions to your being, like remote eyes and ears (webcams and mobile phones) and expanded memory (the world of details you can search for online). These become the structures by which you connect to the world and other people. These structures in turn can change how you conceive of yourself and the world. We tinker with your philosophy by direct manipulation of your cognitive experience, not indirectly, through argument. It takes only a tiny group of engineers to create technology that can shape the entire future of human experience with incredible speed. Therefore, crucial arguments about the human relationship with technology should take place between developers and users before such direct manipulations are designed. p.6
This makes me think about libraries and the difference between a physical repository of credited and credible information vs. complete and total trust of a hyper-anonymous ethersphere. What scares me about digital print is the opportunity for revisionism. Jaron goes deeper in the passage above, hinting at the influence the interfaces themselves have on human cognition and physical manipulation. The unintended consequences will become more apparent as the technology evolves at exponential rates of change, faster than anything anyone alive today can fathom (save for people like Jaron and Kurzweil, et al.).
There is a constant confusion between real and ideal computers. p.6
The brittle character of maturing computer programs can cause digital designs to get frozen into place by a process known as lock-in. This happens when many software programs are designed to work with an existing one. p.7

The unintended consequence of lock-in is felt acutely in large organizations with enormous "legacy" issues in their "back end" or "middleware" systems. The cost/benefit equation is used to justify a lack of upgrading at the expense of the customer or the business in terms of limitations or poor experience offerings. The greatest risks are not to the systems themselves but to the cultures, the people and processes that rely on them. Over time, this lock-in can lead to lapses of vision, perspective, or even the ability to survive in the marketplace.
Software is worse than railroads because it must always adhere with absolute perfection to a boundlessly particular, arbitrary, tangled, intractable messiness. p.8
The process of lock-in is like a wave gradually washing over the rulebook of life, culling the ambiguities of flexible thoughts as more and more thought structures are solidified into effectively permanent reality.
The philosopher Karl Popper was correct when he claimed that science is a process that disqualifies thoughts as it proceeds... p.9

Makes me think of that quote hanging at my desk:

"To define is to kill. To suggest is to create." — Stephane Mallarmé

Validity-based vs. analytically based thinking is an age-old "friction" between "design" and "business" or "art" and "science," etc. Some processes like to use past data to project future trends, a process I've heard referred to as "driving forward while looking in the rear-view mirror." Art likes to try stuff out, fail early, refine, try again, and is resistant to the quantifiable modelling of analysis in the empirical or traditional sense. When change in the marketplace was not exponential, analytical (data-based) thinking had a glimmer of hope and relevance. Now, as we are seeing exponential change, validity-based thinking will be more the norm (in successful organizations). As some people in the industry have seen, we've transitioned from an economy of scale to an economy of choice.

Lock-in, however, removes design options based on what is easiest to program, what is politically feasible, what is fashionable, or what is created by chance.
If it's important to find the edge of mystery, to ponder the things that can't quite be defined—or rendered into a digital standard—then we will have to perpetually seek out entirely new ideas and objects, abandoning old ones like musical notes... I'll explore whether people are becoming like MIDI notes—overly defined, and restricted to what can be represented by a computer. p.10

The above is the central argument of his book. In 1999 I tried to present a concept I was working on about computer-generated music. I predicted in the presentation, based on the research done by many AI people on player pianos (also a great book by Vonnegut), that within the decade we would have access at the consumer level to software that would allow us to not only compose "MIDI" music but truly incorporate the nuances of tremolo or sustain, tonality, color, tempo, even human error or deviances within a performance. I was laughed at and walked away with my tail between my legs, but was redeemed the second Apple released GarageBand. That was almost ten years later, though, and all the people in the room most likely forgot my weak presentation.

The human organism, meanwhile, is based on continuous sensory, cognitive, and motor processes that have to be synchronized in time. UNIX expresses too large a belief in discrete abstract symbols and not enough of a belief in temporal, continuous, non abstract reality... p.11
The ideas expressed by the file include the notion that human expression comes in severable chunks that can be organized as leaves on an abstract tree—and that the chunks have versions and need to be matched to compatible applications. p.13

"network effect." Every element in the system—every computer, every person, every bit—comes to depend on relentlessly detailed adherence to a common standard, a common point of exchange. p.15

The central mistake of recent digital culture is to chop up a network of individuals so finely that you end up with a mush. You then start to care about the abstraction of the network more than the real people who are networked, even though the network by itself is meaningless. Only the people were ever meaningful. p.17
humanism in computer science doesn't seem to correlate with any particular cultural style. p.17
the web 2.0 designs actively demand that people define themselves downward. p.19
Again, see the Mallarmé quote.
• Emphasizing the crowd means deemphasizing the individual humans in the design of society, and when you ask people not to be people, they revert to bad moblike behaviors. This leads not only to empowered trolls, but to a generally unfriendly and unconstructive online world.
• Finance was transformed by computing clouds. Success in finance became increasingly about manipulating the cloud at the expense of sound financial principles.
• There are proposals to transform the conduct of science along similar lines. Scientists would then understand less of what they do.
• Pop culture has entered into a nostalgic malaise. Online culture is dominated by trivial mashups of the culture that existed before the onset of mashups, and by fandom responding to the dwindling outposts of centralized mass media. It is a culture of reaction without action.
• Spirituality is committing suicide. Consciousness is attempting to will itself out of existence.
p.19-20
Someone who has been immersed in orthodoxy needs to experience a figure-ground reversal in order to gain perspective. p.23
The Rapture and the Singularity share one thing in common: they can never be verified by the living. p.26

I think Kurzweil was speaking of Singularity in the sense of a merger; not one or the other. I'll have to check on that. 

A computer isn't even there unless a person experiences it. p.26

Guns are real in a way that computers are not. p.27
The first tenet of this new culture [Silicon Valley, et al, sic] is that all of reality, including humans, is one big information system. p.28
...it promotes a new philosophy: that the computer is evolving into a life-form that can understand people better than people can understand themselves. p.28
I say that information doesn't deserve to be free... What if it's even less inanimate, a mere artifact of human thought? What if only humans are real, and information is not?... there is a technical use of the term "information" that refers to something entirely real. That is the kind of information that is related to entropy... Information is alienated experience. p.28

Experience is the only process that can de-alienate information. p.29

What Kurzweil refers to as the utility of data used as information. And then there's that super dense black hole conversation about knowledge vs information vs data... 

What the [Turing] test really shows us, however, even if it's not necessarily what Turing hoped it would say, is that machine intelligence can only be known in a relative sense, in the eyes of a human beholder. p.31
Chess and computers are both direct descendants of the violence that drives evolution in the natural world. p.33
If that is true, then the objective in chess is to make moves that promote more moves for yourself while limiting the options of the opponent. Which would lead someone to refer to the "violence" as more of a disruption or challenge rather than some harmful attack. Unless, of course, the chess game is real survival. But that is for the movies. 
In order for a computer to beat the human chess champion, two kinds of progress had to converge: an increase in raw hardware power and an improvement in the sophistication and clarity with which the decisions of chess play are represented in software. p.34

When people are told that a computer is intelligent, they become prone to changing themselves in order to make the computer appear to work better, instead of demanding that the computer be changed to become more useful. p.36

Consciousness is situated in time, because you can't experience a lack of time, and you can't experience the future. p.42
Isn't the only way to have a future or a now to have a past? In the case of amnesia... I forgot what I was going to write...
people are encouraged by the economics of free content, crowd dynamics, and lord aggregators to serve up fragments instead of considered whole expressions or arguments. p.47
Yeah. Because we (the consumers and workers, etc.) received more access to a wider and deeper range of content in multiplied contexts. What's the difference between a card catalogue at a library and a feed aggregator? Little, in terms of the "codex" or format. There is an arrangement and structure, and a degree of access to information about objects or cards... Since when did we get whole expressions or arguments when engaging with the "media"? For people outside the world of "nerd," computers are largely entertainment centers.
The only hope for social networking sites from a business point of view is for a magic formula to appear in which some method of violating privacy and dignity becomes acceptable. p.55
The value of a tool is its usefulness in accomplishing a task. p.59
If we are to continue to focus the powers of digital technology on the project of making human affairs less personal and more collective, then we ought to consider how that project might interact with human nature. p.62

FUD—fear, uncertainty, doubt. p.67

Information systems need to have information in order to run, but information underrepresents reality. p.69

What computerized analysis of all the country's school tests has done to education is exactly what Facebook has done to friendships. In both cases, life is turned into a database. p.69

The places that work online always turn out to be the beloved projects of individuals, not the automated aggregations of the cloud. p.72

Yeah, but these "individuals" have relationships of opportunity and influence with other people, whether they are in a cloud or in a cubicle. This kind of innovation doesn't happen in a vacuum.

The deep design mystery of how to organize and present multiple threads of conversation on a screen remains as unsolved as ever. p.72

It's the people who make the forum, not the software. p.72
once you have the basics of a given technological leap in place, it's always important to step back and focus on the people for a while. p.72
People will focus on activities other than fighting and killing one another only so long as technologists continue to come up with ways to improve living standards for everyone at once. p.80

If money is flowing to advertising instead of musicians, journalists, and artists, then a society is more concerned with manipulation than truth or beauty. If content is worthless, then people will start to become empty-headed and contentless. p.83
Which usually leads to a backlash of "authentic" expression; some art historians would say this is a cyclical pattern in "post-capitalist" societies.

The limitations of organic human memory and calculation used to put a cap on the intricacies of self-delusion. p.96

There are so many layers of abstraction between the new kind of elite investor and actual events on the ground that the investor no longer has any concept of what is actually being done as a result of investments. p.97

Each layer of digital abstraction, no matter how well it is crafted, contributes some degree of error and obfuscation. No abstraction corresponds to reality perfectly. A lot of such layers become a system unto themselves, one that functions apart from the reality that is obscured far below. p.97

Locks are only amulets of inconvenience that remind us of a social contract we ultimately benefit from. p.107

Economics is about how to best mix a set of rules we cannot change with rules that we can change. p.112

The economy is a tool, and there's no reason it has to be as open and wild as the many open and wild things of our experience. But it also doesn't have to be as tied down as some might want. It should and could have an intermediate level of complexity. p.117

cybernetic totalism will ultimately be bad for spirituality, morality, and business. In my view, people have often respected bits too much, resulting in a creeping degradation of their own qualities as human beings. p.119

And if you look at the evolution of the technology closely, the "big ticket" technology items seem to be about expression or capture or passive viewing of the human story (TVs, cameras, music, games, etc.). So again, Kurzweil may be onto something when he speaks of convergence... We're using VR and gesture and voice to augment the normally tactile activities in our lives so we can spend more time playing, no?

Ideal computers can be experienced when you write a small program. They seem to offer infinite possibilities and an extraordinary sense of freedom. Real computers are experienced when we deal with large programs. They can trap us in tangles of code and make us slaves to legacy. p.119

If each cultural expression is a brand-new tiny program, then they are all aligned on the same starting line. Each one is created using the same resources as every other one. p.120

That's one reason web 2.0 designs strongly favor flatness in cultural expression. p.120

Let's suppose that back in the 1980s I had said, "In a quarter century, when the digital revolution has made great progress and computer chips are millions of times faster than they are now, humanity will finally win the prize of being able to write a new encyclopedia and a new version of UNIX!" It would have sounded utterly pathetic. p.122
Welcome to my world. We've seen it all coming for a while; back in the 1950s there was the jetpack stuff and the Jetsons, etc. It's like we're bracing ourselves. Somewhere along the way we forgot to think about the social and emotional impacts of technological disruption and innovation and change on such rapid scales and at such rapid paces.
The distinction between first-order expression and derivative expression is lost on true believers in the hive. First-order expression is when someone presents a whole, a work that integrates its own worldview and aesthetic. It is something genuinely new in the world. Second-order expression is made of fragmentary reactions to first order expression. p.122
Only people can make schlock, after all. A bird can't be schlocky when it sings, but a person can. p.123
I've seen computers make some SERIOUS schlock. I mean, SERIOUS. See Makers by Cory Doctorow.
The decentralized nature of architecture makes it almost impossible to track the nature of the information that is flowing through it. p.123
In more recent eras, ideologies related to privacy and anonymity joined a fascination with emerging systems similar to some conceptions of biological evolution to influence engineers to reinforce the opacity of the design of the internet. Each new layer of code has furthered the cause of deliberate obscurity. p.124

The appeal of deliberate obscurity is an interesting anthropological question... One is a desire to see the internet come alive as a metaorganism: many engineers hope for this eventually, and mystifying the workings of the net makes it easier to imagine it is happening. There is also a revolutionary fantasy: engineers sometimes pretend they are assailing a corrupt existing media order and demand both the covering of tracks and anonymity from all involved in order to enhance this fantasy... the result is that we must now measure the internet as if it were a part of nature, instead of from the inside, as if we were examining the books of a financial enterprise. p.124

Some of the youngest, brightest minds have been trapped in a 1970s intellectual framework because they are hypnotized into accepting old software designs as if they were facts of nature. p.126

pattern exhaustion, a phenomenon in which a culture runs out of variations of traditional designs in their pottery and becomes less creative. p.128

Spore addresses an ancient conundrum about causality and deities that was far less expressible before the advent of computers. It shows that digital simulation can explore ideas in the form of direct experiences, which was impossible with previous art forms. p.132

A HYPOTHESIS LINKS the anomaly in popular music to the characteristics of flat information networks that suppress local contexts in favor of global ones. p.133

A digital image of an oil painting is forever a representation, not a real thing. p.133

The definition of a digital object is based on assumptions of what aspects of it will turn out to be important. It will be a flat, mute nothing if you ask something of it that exceeds expectations. p.134

Hip-hop is imprisoned within digital tools like the rest of us. But at least it bangs fiercely against the walls of its confinement. p.135

The hive ideology robs musicians and other creative people of the ability to influence the context within which their expressions are perceived, if they are to transition out of the old world of labels and music licensing. p.136

Every artist tries to foresee or even nudge the context in which expression is to be perceived so that the art will make sense. It's not necessarily a matter of overarching ego, or manipulative promotion, but a simple desire for meaning. p.137

Even if a video of a song is seen a million times, it becomes just one dot in a vast pointillist spew of similar songs when it is robbed of its motivating context. Numerical popularity doesn't correlate with intensity of connection in the cloud. p.137

If you grind any information structure up too finely, you can lose the connections of the parts to their local contexts as experienced by the humans who originated them, rendering the structure itself meaningless. p.138

There are two primary strands of cybernetic totalism. In one strand, the computing cloud is supposed to get smart to a superhuman degree on its own, and in the other, a crowd of people connected to the cloud through anonymous, fragmentary contact is supposed to be the superhuman entity that gets smart. p.139

Once organisms became encapsulated, they isolated themselves into distinct species, trading genes only with others of their kind. p.140

you'll generally find that for most topics, the Wikipedia entry is the first URL returned by the search engines but not necessarily the best URL available. p.143
One of the negative aspects of Wikipedia is this: because of how its entries are created, the process can result in a softening of ambition or, more specifically, a substitution of ideology for achievement. p.143

The distinction between understanding and creed, between science and ethics, is subtle. p.151

computationalism. This term is usually used more narrowly to describe a philosophy of mind, but I'll extend it to include something like a culture... the world can be understood as a computational process, with people as subprocesses. p.153

My first priority must be to avoid reducing people to mere devices. The best way to do that is to believe that the gadgets I can provide are inherent tools and are only useful because people have the magical ability to communicate meaning through them. p.154

The whole point of technology, though, is to change the human situation, so it is absurd for humans to aspire to be inconsequential. p.155
Logical positivism is the idea that a sentence or another fragment—something you can put in a computer file—means something in a freestanding way that doesn't require invoking the subjectivity of a human reader... "The meaning of a sentence is the instructions to verify it."... The new version of the idea is that if you have a lot of data you can make logical positivism work on a large-scale statistical basis. The thinking goes that within the cloud there will be no need for the numinous halves of traditional oppositions such as syntax/semantics, quantity/quality, content/context, and knowledge/wisdom. p.155
"realism." The idea is that humans, considered as information systems, weren't designed yesterday, and are not the abstract playthings of some higher being, such as a web 2.0 programmer in the sky or a cosmic spore player. Instead, I believe that humans are the result of billions of years of implicit, evolutionary study in the school of hard knocks. The cybernetic structure of a person has been refined by a very large, very long, and very deep encounter with physical reality... what can make bits have meaning is that their patterns have been hewn out of so many encounters with reality that they aren't really abstractable bits anymore, but are instead a non-abstract continuation of reality... Realism is based on specifics, but we don't yet know—and might never know—the specifics of personhood from a computational point of view. The best we can do right now is engage in the kind of storytelling that evolutionary biologists sometimes indulge in.   p.157
Fourier Transform. A Fourier transform detects how much action there is at particular "speeds" (frequencies) in a block of digital information. p.161

Gabor wavelet transform... This mathematical process identifies individual blips of action at particular frequencies in particular places, while the Fourier transform just tells you what frequencies are present overall. p.161
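An aside from me, not from the book: the two excerpts above describe the practical difference between a plain Fourier transform and a Gabor-style windowed analysis. Here is a minimal, hypothetical Python/NumPy sketch of that difference; the test signal, sample rate, and window size are invented purely for illustration.

# Illustrative only: global FFT vs. a Gabor-style windowed analysis.
# The signal (50 Hz for one second, then 120 Hz) is made up.
import numpy as np

fs = 1000                                  # sample rate in Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)              # two seconds of samples
signal = np.where(t < 1.0,
                  np.sin(2 * np.pi * 50 * t),
                  np.sin(2 * np.pi * 120 * t))

# Fourier transform: which frequencies are present overall, but not when.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, 1 / fs)
print("strongest overall frequencies (Hz):",
      np.sort(freqs[np.argsort(spectrum)[-2:]]))

# Gabor-style analysis: slide a Gaussian-windowed FFT along the signal,
# tying each blip of activity to a particular place in time.
win = 256
gaussian = np.exp(-0.5 * ((np.arange(win) - win / 2) / (win / 8)) ** 2)
for start in range(0, signal.size - win, win):
    chunk = signal[start:start + win] * gaussian
    peak = np.argmax(np.abs(np.fft.rfft(chunk)))
    peak_hz = np.fft.rfftfreq(win, 1 / fs)[peak]
    print(f"t ~ {start / fs:.2f} s: dominant frequency ~ {peak_hz:.0f} Hz")

The global transform reports both 50 Hz and 120 Hz with no sense of order; the windowed pass shows roughly 50 Hz in the first second and roughly 120 Hz in the second, which is the "particular frequencies in particular places" idea in the excerpt.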

Odors are completely different, as is the brain's method of sensing them. p.162
The number of distinct odors is limited only by the number of olfactory receptors capable of interacting with them. p.163

There is no way to interpolate between two smell molecules... colors and sounds can be measured with rulers, but odors must be looked up in a dictionary. p.163

smelly chemicals... are tied to the many stages of rotting or ripening of organic materials. As it turns out, there are three major, distinct chemical paths of rotting, each of which appears to define a different stream of entries in the brain's dictionary of smells. p.164

A smell is a synecdoche: a part standing in for the whole. p.164

Olfaction, like language, is built up from entries in a catalog, not from infinitely morphable patterns... the grammar of language is primarily a way of fitting those dictionary words in a larger context. p.165

This is perhaps the most interesting takeaway from the book: the olfactory as a medium, as a sense, as a channel.

One of Darwin's most compelling evolutionary speculations was that music might have preceded language. He was intrigued by the fact that many species use song for sexual display and wondered if human vocalizations might have started out that way too. It might follow, then, that vocalizations could have become varied and complex only later, perhaps when song came to represent actions beyond mating and such basics of survival. p.167

The brain's cerebral cortex areas are specialized for particular sensory systems, such as vision. There are also overlapping regions between these parts—the cross-modal areas I mentioned earlier in connection with olfaction. Rama [V.S. Ramachandran] is interested in determining how the cross-modal areas of the brain may give rise to a core element of language and meaning: the metaphor. p.171

conflict that has been at the heart of information science since its inception: Can meaning be described compactly and precisely, or is it something that can emerge only in approximate form based on statistical associations between large numbers of components? p.173

when you deny the specialness of personhood, you elicit confused, inferior results from people. p.177

Separation anxiety is assuaged by constant connection. p.180

software development doesn't necessarily speed up in sync with improvements in hardware. It often instead slows down as computers get bigger because there are more opportunities for errors in bigger programs. Development becomes slower and more conservative when there is more at stake, and that's what is happening. p.181

Some of the greatest speculative investments in human history continue to converge on silly Silicon Valley schemes that seem to have been named by Dr. Seuss. On any given day, one might hear of tens or hundreds or millions of dollars flowing to a start-up company named Ublibudly or MeTickly. These are names I just made up, but they would make great venture capital bait if they existed. At these companies one finds rooms full of MIT PhD engineers not seeking cancer cures or sources of safe drinking water for the underdeveloped world but schemes to send little digital pictures of teddy bears and dragons between adult members of social networks. At the end of the road of the pursuit of technological sophistication appears to lie a playhouse in which humankind regresses to nursery school. p.182

Yes, I agree whole-heartedly that "social networking" is in its infancy—especially when you approach it from a purely technological viewpoint, as we tend to do in every industry that touches a machine or uses one as a mediation device. If we ditch the computer when thinking about these interactions, we'll find there are several disciplines, both professional and academic, that have been dealing with many of the issues inherent in social networking on the internet.

For more information about Jaron Lanier, see his website: http://www.jaronlanier.com/


Online Profiles & the Movie "Salt"

. Dennis Schleicher

The movie Salt gives us insight into deep culture's conception of how many profiles one person can have.

Observation: In the movie Salt and in other spy movies, we have seen a move from the single-profile spy (James Bond) to the multiple-profile spy (Salt), in which Salt is a spy, a counter-spy, and a counter-counter-spy. People nowadays seem to have no problem following this plot line and character (the wilderness of mirrors). This is in addition to her two "cover" profiles, which would exist in each national culture.

Observation: In high schools now kids are no longer exclusively nerds, jocks, stoners, etc. Each of these can be short-lived roles even within a single day.

Observation: Facebook and LinkedIn are used by the same people in very different ways.

The Question: How many different profiles can a person have? And, how many different profiles of 1 person can other people keep track of?

Answer: I think the answer is 3 to 5. We see Facebook, LinkedIn, usually people's corporate in-company profile, and their personal profile (usually mediated through the channels of email, phone calls, and text messages). In the movie Salt we see 5. It is without a doubt more than 1.

References

Chris Messina talked about people having 5 profiles in NY at the Overlap event.

Why spies, counter-spies, and counter-counter spies are so popular right now.

The Official Site for the Movie Salt

Comments on high-school groups (I'm still looking for my reference)

See Also

Shoptimism