Chris Nuzum Hyperkult XXV Video | Tripping Up Memory Lane

May 15, 2016 · · Posted by Greg Lloyd

Watch this video of Chris Nuzum's Tripping Up Memory Lane talk at Hyperkult 2015, University of Lüneburg, 10 July 2015. Traction Software CTO and co-founder Chris Nuzum reviews hypertext history, his experience as a hypertext practitioner, and the core principles of Traction TeamPage.

Live video Christopher Nuzum: Tripping up Memory Lane Hyperkult XXV
Adobe Flash required for desktop Chrome or Internet Explorer 10 and earlier.

More

"Thoughtvectors in Concept Space badge" by @iamTalkyTina my posts | thoughtvectors.net

Related

Tripping Up Memory Lane - Chris Nuzum's written notes for his Hyperkult XXV talk.

Traction Roots - Doug Engelbart - About Doug Engelbart's Journal and Traction.

Original Traction Product Proposal - Hypertext roots and evolution of Traction TeamPage.

Doug Engelbart | 85th Birthday Jan 30, 2010 - "Doug Engelbart sat under a twenty-two-foot-high video screen, "dealing lightning with both hands." At least that's the way it seemed to Chuck Thacker, a young Xerox PARC computer designer who was later shown a video of the demonstration that changed the course of the computer world."

Thought Vectors - Ted Nelson: Art not Technology - "To give up on human understanding is to give up hope, what we call in English 'a counsel of despair.' I think there is hope for much better and more powerful software designs that will give ordinary people the power over computers that they have always wanted - power with complete understanding. But that requires inspired software design, which I believe is art and not technology."

The Work Graph Model: TeamPage style - The social dance of getting things done, dealing with exceptions, and staying aware of what’s going on around you

Original Traction Product Proposal

August 24, 2015 · · Posted by Greg Lloyd

I hope you'll enjoy reading the original Traction Product Proposal, dated October 1997. Many early Traction concepts carried over directly to the Teampage product first commercially released in July 2002, but we've also learned a lot since then - as you might hope! The quotes still make me smile. The Proposal and Annotated References may be helpful to students interested in the history and evolution of hypertext.

Motivated by Chris Nuzum's recent Tripping Up Memory Lane talk at HyperKult 2015, and Takashi's Design Concepts followup, I'm happy to continue the Traction history theme. I've removed the Confidential markings from the Proposal, and released it under the Creative Commons Attribution-Non-Commercial license (CC BY-NC 4.0), so you're welcome to read and use it for non-commercial purposes with attribution. Please link directly to this blog post.

Traction Software folk may make occasional blog posts referencing the Traction History project on this blog or on Twitter. Please follow @TractionTeam on Twitter, and feel free to message me as @roundtrip if you have questions.

The scribbled picture above, from about the same time, was my visualization of the Traction goal: to link and use anything that would cross a business person's desk using the Web as a platform, rather than limiting hypertext to content stuffed inside silos like Lotus Notes.


When we introduced Teampage in 2002, the word "blog" was often dogmatically defined as the unedited voice of a person. It was a tough slog to introduce a chronological stream of content created by a group of people rather than a single individual. The concept of an activity stream or Slack channel - a group of people talking in a shared space or channel - better captures what Teampage does.

Teampage extends the concept of an activity stream or channel to include:

  1. Editable entries with a full audit trail, including wiki history
  2. An extensible family of entry types (task, status, ...) and relationships (comment, ...)
  3. Dashboard and other views that collect, organize, and show entries in context
  4. A unified permission model that makes it simple to roll up entries across spaces and navigate or search by topic, context, author, or other criteria (see The Work Graph Model: TeamPage style)
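As a toy illustration of the first, second, and fourth points, typed entries can be rolled up into one stream filtered by the spaces a reader may see. All class, field, and function names here are invented for the sketch; this is not TeamPage's actual data model or API.

```python
from dataclasses import dataclass
from typing import Optional

# Invented names for illustration only -- not TeamPage's real data model.

@dataclass
class Entry:
    id: int
    space: str                        # the space (channel) the entry lives in
    kind: str                         # extensible type: "article", "task", "status", ...
    text: str
    related_to: Optional[int] = None  # e.g. a comment pointing at its parent entry

def visible_stream(entries, readable_spaces):
    """Roll up one chronological stream across all spaces the reader may see."""
    return [e for e in entries if e.space in readable_spaces]

entries = [
    Entry(1, "support", "article", "Customer question"),
    Entry(2, "internal", "comment", "Private staff discussion", related_to=1),
    Entry(3, "support", "task", "Follow up with customer"),
]

# An external customer who can read only the "support" space never sees entry 2:
customer_view = visible_stream(entries, {"support"})
assert [e.id for e in customer_view] == [1, 3]
```

The point of the sketch is that filtering happens once, at the stream level, so every view (dashboard, search, digest) inherits the same visibility rules.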

Clay Shirky got the concept in his 2003 review: Traction: Weblogs grow up in Social Software: A New Generation of Tools, Release 1.0 Vol 21, No. 5 (pdf). So did Jon Udell in his 2002 InfoWorld review: Getting Traction Traction's enterprise Weblog system gets a grip on corporate KM.

"Somewhere around your 30th responses to a response to a response in Notes, you start to wonder where all this group discussion leads. Somewhere around the fifth time a document marches by with yet more groupware annotations and digital yellow stickies attached, you wonder if it is really all that wise to have all of that group editing taking place. After all, isn’t the purpose of a group to tap the greater intellect represented by all those fine thingies in the group and, once tapped, move quickly to a better decision? Shouldn’t the purpose of groupware be to build more intelligence rather than more features into the product?

While it’s useful to share documents, hold ad hoc discussions and post groupwide projects, the essence of groupware may be the ability to manage a business outcome by divining a group's thought process."

Eric Lundquist, The Next Big Thing in Groupware PC Week 1 July 1996.
Team Problem Solving from Traction Product Proposal Oct 1997 

The core concept was granted US Patent 7,593,954.

The original business case for Teampage cited project work as the most important use. We've learned that it's valuable to give people a straightforward way to link action tracking, messaging, and collaborative content creation. By creating and tracking tasks that can be attached directly to Teampage or external content, it's easy to see and stay on top of what's happening for you, by person, by channel, or in the context of a specific Teampage project.

We learned how to model permissions to extend work across many internal as well as external groups such as the clients of a consulting firm, or the suppliers and customers of a manufacturer. The Teampage model of multiple permissioned spaces was added soon after the 1997 proposal. You can focus on any space (like a channel) as well as search and navigate across all spaces and entries you have permission to see.

By adding individual and group permissions to a space with an ACL model, internal and external groups share the same Teampage server while seeing and participating in just the set of projects and activities appropriate for each individual. Comments, tasks, and tags can cross spaces, so it's simple for internal team members to have a more private discussion linked to a specific paragraph of a page or question posted by an external customer. Streams, discussions, notifications, digests, navigation, and search all obey the permissions defined by business rules enforced at the core level.
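A minimal sketch of the ACL idea just described, with invented names and no claim to match TeamPage's real implementation: each space carries an ACL of users and groups, and a reader may see a space if listed directly or via membership in a listed group.

```python
# Toy ACL sketch; all names are invented for illustration.
groups = {
    "staff": {"ann", "bob"},
    "acme-client": {"carol"},
}

space_acl = {
    "internal":     {"users": set(),    "groups": {"staff"}},
    "acme-project": {"users": {"dave"}, "groups": {"staff", "acme-client"}},
}

def can_read(user, space):
    acl = space_acl[space]
    return user in acl["users"] or any(user in groups[g] for g in acl["groups"])

def readable_spaces(user):
    """Navigation, search, and digests would all filter through this set."""
    return {s for s in space_acl if can_read(user, s)}

assert can_read("carol", "acme-project")   # external client sees the shared project
assert not can_read("carol", "internal")   # but not the internal space
assert readable_spaces("ann") == {"internal", "acme-project"}
```

Because every view filters through one permission check, an internal team and an external client can coexist on the same server without either seeing more than the ACLs allow.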

Email and Teampage have an interesting history. The 1997 proposal describes Traction as an alternative to broadcast email, but cites email as an important source of information to be recorded and shared. An emailed Digest was one of the first features added to TeamPage, based on a beta customer's request. The Digest includes title links and content snippets gathered from the stream of events posted since the previous Digest was emailed. The content of each Digest is clipped to conform to what that person is permitted to see.

The Digest remains a popular feature of Teampage, later augmented by email notifications with auto-threaded email replies: your reply to a Teampage email notification is posted as a comment by you, linked at the right point in the discussion thread (a capability requested by a major consulting firm). I agree with Alan Lepofsky's point that email is one of many channels for messages: we should flip our perspective to the stream of messages rather than the channel used to deliver each message; see Takashi's Eat your spinach post.

This combination of capabilities is particularly valuable for projects that intertwingle collaborative writing, team communication, and action tracking, such as quality management, product development, product support, consulting, and competitive intelligence. See The Work Graph Model: TeamPage style and Teampage Solutions.

A note on links: Although some of the links in the proposal still work, many point to sites which have been moved, including Doug Engelbart references which have moved from bootstrap.org to dougengelbart.org. In most cases a bit of creative Googling will find the referenced page in a different location. If people are interested, I'll publish an editable version of the Annotation References section that can be used to share updated locations. Please message @roundtrip on Twitter if you're interested and willing to pitch in to tracking down current references. Sigh.


Related

Tripping Up Memory Lane - Hyperkult 2015: Hypertext lessons learned talk by Traction Software CTO and Co-founder Chris Nuzum

Teampage hypertext journal: Design concepts, by Takashi Okutsu Director of Traction Software's Japanese Business Office

Traction Roots - Doug Engelbart - About Doug Engelbart's Journal and Traction.

Enterprise 2.0 - Letting hypertext out of its box - "I believe that the radical departure is the Web as the context of work: the universal medium, universal library, universal marketplace, and universal platform for personal as well as enterprise communication... In every previous generation hypertext system, the ability to read, search, link and communicate came with a terrible price: it might work well, but only if you were willing to put everything you wanted to work with into some sealed box, and convince everyone you wanted to work with to use the same box. From the earliest days of Vannevar Bush's Memex, the vision was universal, but the implementation was siloed."

Intertwingled Work - Working and scaling like the Web. "... in the past, conversations could only be intertwingled across paper memos, faxes, written reports and email. Until the advent of the Web it wasn't possible to intertwingle conversations, networks, analysis and work in near-real time and global scale. Now that's trivial and essentially free with basic Web access."

The Work Graph Model: TeamPage style - The social dance of getting things done, dealing with exceptions, and staying aware of what’s going on around you

Tripping Up Memory Lane

July 16, 2015 · · Posted by Christopher Nuzum

Last week I gave a talk at the Hyperkult 2015 conference. It was an honor to present there, especially since it was the 25th and final time the conference was held. This was my proposal for the talk:

Sometimes it seems like collaborative software projects are designed in an ahistorical vacuum. Like all our ideas are new. Maybe that’s because so much software is designed by young people fresh out of computer science programs heavy in programming and data structures, but often paying little more obeisance to the history of software than to acknowledge that once people programmed on punch cards, however that worked.

In 1996, after celebrating the 50th Anniversary of As We May Think at the Massachusetts Institute of Technology, and inspired by a long, encouraging talk with Doug Engelbart, I co-founded Traction Software (originally Twisted Systems, Inc.) and set out to design a memex-inspired literary machine for the augmentation of collective intelligence. In this talk, I’d like to demonstrate how the Traction Hypertext Journaling Engine underlying Traction Software’s TeamPage product borrows from and builds on insights and ideas from Vannevar Bush, Doug Engelbart, and Ted Nelson. I’ll also talk a bit about what ideas we’ve abandoned and why, and end with some thoughts on ideas that I think haven’t yet had their day.

I'd never given a talk in Germany before, but since the German word Vorlesung means "reading", I thought I had better be prepared with something I could read, even though that's not how I'm used to presenting.

For anyone interested, I've posted the script I prepared for the talk: Tripping Up Memory Lane Script.pdf (14.2MB). The PDF also includes high-resolution versions of the images I used in my slides.

I hope you'll enjoy.

Update: See the University of Lüneburg's video of this talk. Adobe Flash required for Chrome and Internet Explorer 10 and older.

My Part Wor ks

May 22, 2015 · · Posted by Greg Lloyd

About 50 years ago, Andy van Dam joined the Brown University faculty with the world's second PhD in Computer Science (earned at the University of Pennsylvania). Today many of Andy’s friends, faculty, students and former students are celebrating his 50 years at Brown with Stone Age, Iron Age and Machine Age panels. [ June 9, 2015 update: See event video: Celebrate with Andy: 50 Years of Computer Science at Brown University ]

I’m part of the Stone Age cohort. In 1968 Andy and his Swarthmore colleague Ted Nelson gave a medicine show pitch to convince skeptical undergrads to sign up for an insanely demanding one-year, four-course sequence then called Applied Math 101/102 and 103/104. I bit.

Starting with a tiny, two person department - and as a matter of principle - Andy recruited undergraduates as teaching and research assistants, a tradition that continues to this day. In an essay on the history of the Brown CS UTA [Undergraduate Teaching Assistant] program Andy said:

“Offering teaching and research assistant opportunities to undergrads,” he says, “was even more unusual, indeed was viewed with everything from skepticism to outright hostility. Hardly anyone said, ‘What a fantastic idea!’ Everyone was used to four years of preparation as an undergraduate, then n years of graduate work before you could contribute to a science. But we’re different. CS was and is young, experimental, and open for undergrads to contribute. And undergraduate participation in research in all fields has become commonplace, especially in the last decade.”

In 1965, a single, intense full-year course could cover much of the breadth, if not the depth, of the systems-oriented portion of the discipline, not including theory, AI, numerical analysis, and a few other topics. Andy insisted that students couldn’t learn to be good programmers by solving small “toy” problems; they had to write significantly-sized programs, each taking multiple weeks.

Not just checking for the right answer but giving useful feedback on structure, style, and efficiency required careful reading and one-on-one help with concepts and debugging. In a class with forty students, it was impossible for one graduate TA and a professor to provide this level of attention, no matter how little sleep they were getting, so van Dam asked for help from students who had taken a prior programming course. In that first cohort, he remembers Bill Adcock; Dan Bergeron, who subsequently got his PhD with Andy, became Chairman of the CS Department at UNH, and joined Andy and six other of his students for Andy's first sabbatical in 1971 at the University of Nijmegen in Andy's country of origin; and Dennis Ruggles, among others.

“The undergraduate teaching assistants,” Andy explains, “though they were initially called graders, didn’t just grade programs -- they not only provided one-on-one help to students but also became active participants in course design and in subsequent years read research papers and brought new ideas into the curriculum. In fact, they did everything graduate TAs did, becoming producers and not just consumers of education. We kept modifying the course as we went along, but the one constant was the highly-appreciated UTA system.”

Few people appreciate it more than Ed Lazowska ‘72, who will lead the first (“Stone Age”) panel for Celebrate With Andy. He says, “I’m a faculty member precisely because of the UTA program. I went to grad school because Andy told me to. In some way, everything I do professionally today is due to him.”

To provide feedback for the course, students wrote detailed, multi-page evaluations, something that was almost unheard of in 1965. As Bob Munck recalls, “Also after every class, the graders would sit around on the floor of Andy's office (later my office) and critique the lecture and him. I'd never seen anything like it.”

On his commute home from work, Andy would listen to tape recordings of his lectures, filling the empty minutes with self-critiques: “Boy, was that a clumsy explanation! Get rid of the ‘um’s and the ‘you know’s.” Presentation skills are still something that van Dam is keenly interested in. “Today’s equivalent of ‘you know’ is ‘like’, which I try to stamp out in all students who work with me. I’ve given up on ‘awesome’.”

An interesting aspect of the UTA program is that the system has essentially never been challenged by students due to the built-in checks and balances. “By having rotating TAs and detailed rubrics,” Andy says, “you create fairness. It’s a system that’s at least as fair as having a single faculty member grading. Besides, a single faculty member, even assisted by a few graduate TAs, can’t begin to read that many programs at the required level of detail, and students recognize that. Part of the checks and balances is that faculty members are responsible for assigning the final grades, and I personally review all borderline grades, hoping to find evidence for promotion to the next grade bin.”

Originally something made up as they went along, the UTA program matured over a period of decades. Iteration and gradual regularization brought cross-course norms and standards that are used today by almost all Brown CS courses. “In my opinion,” says Andy, “We have the most systemic TA program, and there’s a well-defined appeal system in place to address any grading errors.”

You can read about Andy’s honors and achievements on his Wikipedia page, and Professor Shriram Krishnamurthi's answer to Why is professor Andy Van Dam (Andy) so cool? Here are two short stories from me.

After Dinner

Photo of Andy on WGBH Boston’s After Dinner show, broadcast live at 7:30 PM Monday, October 20, 1969.

After Dinner featured Andy van Dam, Chris Braun, Bev Hodgson (then Brown Daily Herald editor), Al Basile and myself talking about hypertext for 30 minutes on a stage set that was supposed to look like a professor’s living room, right next to Julia Child’s WGBH TV kitchen. Andy is pointing to a photo of Chris Braun at the IBM 2250 Hypertext Editing System (HES) console.

AvD writes: You might mention that the topic wasn’t just hypertext per se, but the use of hypertext for non-linear narratives, esp. hypertext fiction as a new literary form (Montreal Expo (68) had just shown an audience-influenced branching movie, Burroughs’ Naked Lunch and Nabokov’s Pale Fire had been published, and experimentation was in the air). I’m sitting in the audience at the YURT inauguration symposium, listening to organizer John Cayley talk about “Cave Writing” and related spatial (immersive) hypertext projects that he and his students craft.

My Part Wor ks

Brown Computer Science circa 1969. Original edition.

The story as I recall: Most people chose an individual final project for AM 101/102. However, a few folk chose the two person assembler project.

A grader did an in-person review with a two person team, noting a problem. One team member replied: “My part works, but he keeps passing me garbage.”

It became a team programming mantra.

The first part was made into a button, with Wor ks spelling. The second part was the AvD equivalent of a secret handshake. Until now.


Related


Andries van Dam - Wikipedia page

Celebrate With Andy: 50 Years Of CS At Brown - May 2015. An essay celebrating "the three golden anniversaries for the Brown CS family: fifty years of the UTA program, undergraduate involvement in research, and Andy van Dam at Brown."

Why is professor Andy Van Dam (Andy) so cool? - Quora, Jan 2015. I agree with Brown CS professor Shriram Krishnamurthi.

Pastepost - One more AvD story. The first public document from the first Hypertext Editing System was a press release announcing its own creation.

As We May Work - Andy van Dam - Tokyo 2008

The MIT/Brown Vannevar Bush Symposium - Celebrating the 50th anniversary of Bush's As We May Think. Organized and MC'd by Andy van Dam

Hypertext Editing System - Wikipedia page. Photo by Greg Lloyd.

Enterprise 2.0 - Are we there yet?

November 21, 2014 · · Posted by Greg Lloyd

Andrew McAfee writes Nov 20, 2014: "Facebook’s recent announcement that it’s readying a version of its social software for workplaces got me thinking about Enterprise 2.0, a topic I used to think a great deal about. Five years ago I published a book with that title, arguing that enterprise social software platforms would be valuable tools for businesses...

Why did it take so long? I can think of a few reasons. It’s hard to get the tools right — useful and simple software is viciously hard to make. Old habits die hard, and old managers die (or at least leave the workforce) slowly. The influx of ever-more Millennials has almost certainly helped, since they consider email antediluvian and traditional collaboration software a bad joke.

Whatever the causes, I’m happy to see evidence that appropriate digital technologies are finally appearing to help with the less structured, less formal work of the enterprise. It’s about time.

What do you think? Is Enterprise 2.0 finally here? If so, why now? Leave a comment, please, and let us know."

Andrew – As we’ve discussed in the past, I don’t believe there’s a specific ‘Are we there yet?’ for Enterprise 2.0.

The lessons I learned from your excellent book and research are still relevant today. Enterprise 2.0 technology enables but does not guarantee organizational change. Some organizational change is invented and purposeful, some is serendipitous and emergent.

The effect of new technology on an enterprise is too often like picking up and shaking a sleepy beehive.

We’ve come a long way towards the vision that software and devices used inside a company will become more like the software, Web services, and mobile devices people use at home. Enterprise software and services need to meet the same expectations for clarity, anytime/anywhere access, and ease of use that people expect at home, which shakes markets as well as assumptions. Tracking the relationship of Apple and IBM from Nov 2009 through Nov 2014 (and their market caps) is an instructive example.

As Peter Drucker taught, organizations need to adapt and innovate to make use of these capabilities, which opens the door to new technology, capabilities, and markets for enterprise software and services at every layer of the stack. Which opens the door to new organizational challenges and opportunities…

I’m not surprised that this takes time, and I like Bill Buxton’s analysis in his Long Nose of Innovation article from 2008.

I’ll also keep my faith in Peter Drucker and Doug Engelbart as the twin patron saints of Enterprise 2.0. As I said in Nov 2009, you have your own sub-numinous stake in the game!

cheers,
Greg

Related

Enterprise 2.0, Finally? Andrew McAfee, Nov 20, 2014 (This blog post was originally posted as a comment)

Enterprise 2.0: New Collaborative Tools for Your Organization's Toughest Challenges Andrew McAfee, Harvard Business Review Press, Nov 2009

The Long Nose of Innovation Bill Buxton, Bloomberg Business Week, Jan 8, 2008

Enterprise 2.0 Schism Greg Lloyd, Nov 9, 2009

Named Data Networking - Boffin Alert

September 8, 2014 · · Posted by Greg Lloyd

On Sep 4, 2014 the Named Data Networking project announced a new consortium to carry the concepts of Named Data Networking (NDN) forward in the commercial world. If this doesn't sound exciting, try The Register's take: DEATH TO TCP/IP cry Cisco, Intel, US gov and boffins galore. What if you could use the internet to access content securely and efficiently, where anything you want is identified by name rather than by its internet address? The NDN concept is technically sweet, gaining traction, and is wonderfully explained and motivated in a video by its principal inventor and instigator, Van Jacobson. Read on for the video, a few quotes, reference links, and a few thoughts on what NDN could mean for the Internet of Things, Apple, Google, and work on the Web. Short version: bring popcorn.

For a short non-technical introduction, see Wade Roush's Sep 2012 piece on Van Jacobson and Content Centric Networking The Next Internet? Inside PARC’s Vision of Content Centric Networking. Background: Jacobson's work on CCN begot the NDN project, where he is now a Principal Investigator. A few quotes from Roush's story:

The fundamental idea behind Content Centric Networking is that to retrieve a piece of data, you should only have to care about what you want, not where it’s stored. Rather than transmitting a request for a specific file on a specific server, a CCN-based browser or device would simply broadcast its interest in that file, and the nearest machine with an authentic copy would respond. File names in a CCN world look superficially similar to URLs (for example, /parc.com/van/can/417.vcf/v3/s0/Ox3fdc96a4…) but the data in a name is used to establish the file’s authenticity and provenance, not to indicate location.

It’s easy to see how much sense this makes compared to the current client-server model. Say I’m using my Apple TV box to browse my Flickr photo collection on my big-screen TV. To get each photo, the Apple TV has to connect to Flickr, which is hosted on some remote data center owned by Yahoo—it could be in Utah or North Carolina, for all I know. The request has to travel from the Apple TV over my Wi-Fi network, into Comcast’s servers, then across the Internet core, and finally to Yahoo. Then the photos, which amount to several megabytes each, have to travel all the way back through the network to my TV.

But the photos on Flickr are just copies of the originals, which are stored on my camera and on my laptop, about 15 feet away from my TV. It would be much smarter and more economical if the Apple TV could simply ask for each photo by name—that is, if it could broadcast its interest in the photo to the network. My laptop could respond, and I could keep browsing without the requests or the data ever leaving my apartment. (In Jacobson’s scheme, file names can include encrypted sections that bar users without the proper keys from retrieving them, meaning that security and rights management are built into the address system from the start.)

“The simplest explanation is that you replace the concept of the IP address as the defining entity in the network with the name of the content,” says Lunt. “Now all the talk in the network is about ‘Have you seen this content?’ and ‘Who needs this content?’ as opposed to ‘What is the routing path to particular terminus in the network?’ It’s a simple idea, but it makes a lot of things possible...

“One of the things that’s intriguing about not having to go to the source is that you could start to think about implementing applications differently,” Lunt says. “You could build apps that don’t have any notion of a server at all. So you could have Twitter without Twitter or Facebook without Facebook -- that is, without having to have a major investment in hosting content, because the network is caching it all over the place.”

Such architectures might give users more control over privacy and security of their data, and let them share their own data across devices without having to go through proprietary services like Apple’s iCloud, PARC executives say.

“What Apple is trying to do with iCloud is to say: You shouldn’t have to care which device you got an app on, or which device you took a photo on, whether it was your iPad or iPhone or MacBook Air. You just want your content to be on the other devices when you want it,” says Steve Hoover, CEO of PARC. “That validates our vision. But the way they are solving that puts more load on the network than it needs to, and it requires consumer lock-in. So Apple may be a user of this [CCN] technology one day, because it will make it easier. On the other hand, they could also hate it, because it will make it a lot easier for other people to provide that capability of getting the content whenever you want.”

In my opinion, one of the technically sweetest characteristics of NDN is its relationship to current TCP/IP and networking protocols (quotes from NDN Architecture: Motivation and Details):

Like IP, NDN is a “universal overlay”: NDN can run over anything, including IP, and anything can run over NDN, including IP. IP infrastructure services that have taken decades to evolve, such as DNS naming conventions and namespace administration or inter-domain routing policies and conventions, can be readily used by NDN. Indeed, because NDN’s hierarchically structured names are semantically compatible with IP’s hierarchically structured addresses, the core IP routing protocols, BGP, IS-IS and OSPF, can be used as-is to deploy NDN in parallel with and over IP. Thus NDN’s advantages in content distribution, application-friendly communication, robust security, and mobility support can be realized incrementally and relatively painlessly...

Communication in NDN is driven by the receiving end, i.e., the data consumer. To receive data, a consumer sends out an Interest packet, which carries a name that identifies the desired data (see Figure 2). A router remembers the interface from which the request comes in, and then forwards the Interest packet by looking up the name in its Forwarding Information Base (FIB), which is populated by a name-based routing protocol. Once the Interest reaches a node that has the requested data, a Data packet is sent back, which carries both the name and the content of the data, together with a signature by the producer’s key (Figure 2). This Data packet follows in reverse the path taken by the Interest to get back to the consumer. Note that neither Interest nor Data packets carry any host or interface addresses (such as IP addresses); Interest packets are routed towards data producers based on the names carried in the Interest packets, and Data packets are returned based on the state information set up by the Interests at each router hop (Figure 3).

The router stores in a Pending Interest Table (PIT) all the Interests waiting for returning Data packets. When multiple Interests for the same data are received from downstream, only the first one is sent upstream towards the data source. Each PIT entry contains the name of the Interest and a set of interfaces from which the Interests for the same name have been received. When a Data packet arrives, the router finds the matching PIT entry and forwards the data to all the interfaces listed in the PIT entry. The router then removes the corresponding PIT entry, and caches the Data in the Content Store. Because an NDN Data packet is meaningful independent of where it comes from or where it may be forwarded to, the router can cache it to satisfy future requests. Because one Data satisfies one Interest across each hop, an NDN network achieves hop-by-hop flow balance...

Names

NDN design assumes hierarchically structured names, e.g., a video produced by PARC may have the name /parc/videos/WidgetA.mpg, where ‘/’ indicates a boundary between name components (it is not part of the name). This hierarchical structure is useful for applications to represent relationships between pieces of data. For example, segment 3 of version 1 of the video might be named /parc/videos/WidgetA.mpg/1/3. The hierarchy also enables routing to scale. While it may be theoretically possible to route on flat names (see ROFL), it is the hierarchical structure of IP addresses that enables aggregation, which is essential in scaling today’s routing system. Common structures necessary to allow programs to operate over NDN names can be achieved by conventions agreed between data producers and consumers, e.g., name conventions indicating versioning and segmentation.

Name conventions are specific to applications but opaque to the network, i.e., routers do not know the meaning of a name (although they see the boundaries between components in a name). This allows each application to choose the naming scheme that fits its needs and allows the naming schemes to evolve independently from the network.
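As a sketch of how such producer/consumer naming conventions might look in code (the helper names and the version/segment layout are my assumptions for illustration, not an NDN API):

```python
# Hierarchical NDN-style names as ordered component lists. The network only
# sees component boundaries; the version/segment convention below is an
# agreement between producer and consumer, opaque to routers. Illustrative.

def components(name):
    # '/' delimits components but is not part of any component.
    return name.strip("/").split("/")

def segment_name(base, version, segment):
    # Convention: e.g. segment 3 of version 1 of /parc/videos/WidgetA.mpg
    return f"{base}/{version}/{segment}"

def matches_prefix(name, prefix):
    # Routes aggregate on whole-component prefixes, mirroring how
    # hierarchical IP addresses aggregate in today's routing system.
    n, p = components(name), components(prefix)
    return n[:len(p)] == p

name = segment_name("/parc/videos/WidgetA.mpg", 1, 3)
# name == "/parc/videos/WidgetA.mpg/1/3"
```

Matching on whole components (not raw string prefixes) is the detail that keeps component boundaries meaningful to routing while leaving component contents opaque.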

I haven't quoted from the short sections on Data Centric Security, Routing and Forwarding, Intelligent Data Plane, Caching, or Intellectual Property Approach and open source. You should read NDN Motivation & Details, then much more from named-data.net, whether your head just exploded or you're jumping up and down in your seat with questions and objections.

Much of this is QED Marketing - I told you how it works, not what it means for you. Here are a few thoughts:

1) Secure, efficient transport of content crossing many boundaries is a hard problem, and it gets harder as the number of people, things, and places on the Web grows, and as people look for a seamless and trusted way to deal with things they care about at home and at work. For example, how could Apple (or Google) leverage NDN to deliver on an internet of your things? How might players other than the giants leverage NDN to compete?

2) NDN offers the possibility of doing a lot of the hard work at the network level, which is a win if it offers an economic benefit to those who pay for the fabric of the internet, and opportunities to invent and grow scalable businesses more effectively. For example, what could change if Amazon offered NDN as an Amazon Web Service?

3) NDN might offer an appropriate, secure, flexible framework for connecting people to content at work. Businesses use siloed applications for transactional data for good reasons: they are simpler to build, (potentially) more secure, and (potentially) more flexible than old-style monolithic business applications if they become sources of content linked together at a higher level of an application stack. NDN might be a great protocol for building flexible, secure, extensible business applications connecting people to the content they want - and are allowed to use.

With respect to the network issues, I'm a fan, not an expert, but the NDN proposal seems to share many of the (relatively) simple, scalable, decentralized characteristics that fueled the growth of the Web and the evolution of TCP/IP. NDN seems most attractive for big content, particularly where multicast-style delivery and caching can deliver big bandwidth and responsiveness improvements, but it looks like a lot of thought has gone into efficient localized delivery. Likewise, management of a very large, frequently changing name space is a challenge, one which also seems to have gotten a lot of intelligent attention.

With Cisco and Huawei on board as founding industrial partners of the NDN Consortium, you can bet that a lot of caching routers can be sold. If there's economic payback for NDN, its routing technology will take the fast track, driving better payback, faster adoption, etc.

The good thing is that the program has advanced to the stage where many of these questions can be answered by experiment - we shall see.

Will the NDN Consortium take off? Will Google, Apple and Microsoft jump in? Or will NDN join the queue of technically sweet solutions that never really get off the ground? I'm optimistic that NDN has the right technical characteristics and pedigree, with smart experienced people leading the charge. With the Internet of Things and secure content distribution efficiencies as economic drivers, I hope we'll all benefit from NDN's content delivery model as the next stage of the Web's evolution. If you're not in the battle, bring popcorn and watch - it should be a good show.

Related

Named Data Networking Architecture: Motivation & Details The best short technical overview I've found of the objectives and approach of the Named Data Networking project. Read the overview to get a quick idea of how content is named, the NDN security and caching model, how NDN works over (or under) TCP, scaling issues, and more.

A New Way to Look at Networking - Van Jacobson's Aug 2006 Google Tech talk on TCP and Content Centric Networking (CCN). CCN is the title of Jacobson's Xerox PARC project, which became "the single biggest internal project at PARC." CCN led to the formation of the Named Data Networking project as a National Science Foundation funded Future Internet Architecture program in Sep 2010. Jacobson is currently a Principal Investigator of the NDN project. See Van Jacobson speaks on Content Centric Networking for a longer (three hour) and slightly earlier version of Jacobson's CCN talk, presented as a Future Internet short course, including slides.

Reinventing the Web II (Aug 2014) The Web won vs "better" models by turning permanence into a decentralized economic decision. Why isn't the Web a reliable and useful long term store for the links and content people independently create? What can we do to fix that? Who benefits from creating spaces with stable, permanently addressable content? Who pays? What incentives can make Web scale permanent, stable content with reliable bidirectional links and other goodies as common and useful as Web search over the entire flakey, decentralized and wildly successful Web? NDN is the sweetest and most credible global technical approach I've seen.

Continuity and Intertwingled Work (Jun 2014) A level above an Internet of Things: seamless experience across devices for you, your family, your health and trusted service providers, at home and at work.

Intertwingled Work (Jul 2010) No one Web service or collection of Web servers contains everything people need, but we get along using search and creative services that link content across wildly different sources. The same principle applies when you want to link and work across wildly diverse siloed systems of record and transactional databases.

Thought Vectors - Ted Nelson: Art not Technology (Jul 2014) Ted Nelson should be smiling - but I won't hazard a guess. From what I see, everything in NDN seems compatible with, if not influenced by, the Docuverse, Tumbler, and fine-grain content-addressable network architecture that Nelson described in detail in his 1987 book Literary Machines. I believe NDN provides secure, scalable, fine-grain, and upwards-compatible networking that could connect the front-end and back-end Xanadu architecture that Nelson describes in Literary Machines. I'll follow up on this with a separate Boffin alert.

Linked, Open, Heterogeneous

August 31, 2014 · · Posted by Greg Lloyd

Art, Data, and Business: Duane Degler of Design For Context posted slides from his 5 April 2014 Museums and the Web talk, Design Meets Data (Linked, Open, Heterogeneous). Degler addresses what he calls the LAM (Libraries, Archives, Museums) Digital Information Ecosystem. I believe the same principles apply when businesses connect internal teams, external customers, external suppliers, and partners of all sorts as part of their Business Information Ecosystem. Read Degler's summary and slides, below:

"The tide of available information continues to rise. The opportunities that come from open access, linked data, sharing resources with other institutions, and standards-based data are enticing - and perhaps overwhelming?

Emerging design approaches help you find ways to make the most of your opportunities for new types of interactions and engagement with Information Objects. They focus on:

- Exploration, serendipity, use: Rich, relevant design requires an intimate understanding of information and the way people interact with it. It's more than attractive styling - although that's important. It's about people engaging in ways that stimulate the intellect and the experience. People need to find information, use it, relate other information to it, and share it for decades to come.

- Scalability, persistence, authority: Rich, relevant design also takes the long view. Understanding that the integrity of the information matters. This is increasingly important as we move toward more linked, open, and born digital cultural information.

Your institution becomes a gateway to an ecosystem of artistic imagery, scholarly insights, history, perspectives, and related objects. Other people will use your information to create new interpretations and works, which then build on what you hold. Curating information may be perceived as a burden (to be made easier!), yet it is a significant opportunity to reinforce the value and authority of institutions that enhance the information ecosystem."

Related

Dark Matter by Michael Peter Edson 19 May 2014. "The dark matter of the Internet is open, social, peer-to-peer and read/write—and it’s the future of museums" an important essay on the opportunity and mission for museums and cultural institutions: "We’re so accustomed to the scale of attention that we get from visitation to bricks-and-mortar buildings that it’s difficult to understand how big the Internet is—and how much attention, curiosity, and creativity a couple of billion people can have."

Thought Vectors - Vannevar Bush and Dark Matter (2014) Inspired by Michael Edson's essay. Just as Bush suggested in July 1945, I believe there's a need for people to act as explorers, guides, and trail blazers over knowledge they know and love. You can experience that personal knowledge and passion on a tour, at a talk, or in a conversation on a bus, at a party - anywhere you meet someone who loves one of these institutions. I think it's particularly valuable to have trail blazers who are also skilled professionals personally represent and communicate the values, knowledge, and heritage of their museum, just as a great reference librarian becomes a library's ambassador.

Reinventing the Web II (2014) Why isn't the Web a reliable and useful long term store for the links and content people independently create? What can we do to fix that? Who benefits from creating spaces with stable, permanently addressable content? Who pays? What incentives can make Web scale permanent, stable content with reliable bidirectional links and other goodies as common and useful as Web search over the entire flakey, decentralized and wildly successful Web?

Intertwingled Work (2010) No one Web service or collection of Web servers contains everything people need, but we get along using search and creative services that link content across wildly different sources. The same principle applies when you want to link and work across wildly diverse siloed systems of record and transactional databases.

Thought Vectors - Ted Nelson: Art not Technology

July 5, 2014 · · Posted by Greg Lloyd

The technoid vision, as expressed by various pundits of electronic media, seems to be this: tomorrow's world will be terribly complex, but we won't have to understand it. Fluttering through hailstorms of granular information, ignorant like butterflies, we will be guided by smell, or Agents, or leprechauns, to this or that pretty picture, or media object, or factoid. If we have a Question, it will be possible to ask it in English. Little men and bunny rabbits will talk to us from the computer screen, making us feel more comfortable about our delirious ignorance as we flutter through this completely trustworthy technological paradise about which we know less and less.

To give up on human understanding is to give up hope, what we call in English "a counsel of despair." I think there is hope for much better and more powerful software designs that will give ordinary people the power over computers that they have always wanted - power with complete understanding. But that requires inspired software design, which I believe is art and not technology.

I believe the technoid vision does not comprehend what is humanly desired, humanly needed, and humanly possible. Especially the need and possibility of human understanding. So excuse me from the butterfly crowd; I hope you will come with me to where understanding may be found.

Ted Nelson
The Future of Information
ASCII Corporation, Japan 1997
Image courtesy of Computer History Museum

This quote from Ted Nelson's 1997 book makes a point similar to Nelson's closing point in his July 2014 interview with Gardner Campbell, as well as statements in his 2011 Possiplex autobiography and 1975 Computer Lib / Dream Machines. Nelson sees computer technology as a medium for creative expression, not an end in itself, or a cheap replacement for human creativity. He cites film directors among his primary inspirations and heroes, noting that his personal epiphany came in the early 1960s when he learned that it was possible to connect computers to screens. Nelson invented the terms hypertext and hypermedia to describe the new capabilities that he envisioned. During his 2014 interview Nelson cited the example of Orson Welles. For Ted Nelson, what you see on a computer screen and interact with should be the result of human creative intelligence applied through the use of new engines of expression over an endlessly evolving intertwingled corpus of literature. Using Nelson's cinema analogy, history put him in a position where he would have to invent the motion picture camera to achieve his goals, but I believe his motivation was to become the seminal director and intellectual father of the new media which are his earliest and most influential inventions.

More

"Thoughtvectors in Concept Space badge" by @iamTalkyTina my posts | thoughtvectors.net

Related

Intertwingled, The Festschrift - Ebook celebrating Ted Nelson Day at Chapman University, 2014 (Springer-Verlag) (via @TheTedNelson, 12 Jul 2015) A free Springer ebook edited by Douglas R. Dechow and Daniele C. Struppa. Chapters by Alan Kay, Brewster Kahle, Belinda Barnet, Ken Knowlton, Dame Wendy Hall, and others. Closing chapter What Box? by Ted Nelson. I highly recommend this book.

Living The Dreams: A Conversation With Ted Nelson Published on Jul 5, 2014. Dr. Ted Nelson speaks with Dr. Gardner Campbell about research, fantics, computer liberation, and the ongoing struggle between schooling and learning. A conversation undertaken in support of "Living The Dreams: Digital Investigation and Unfettered Minds," a digital engagement pilot of Virginia Commonwealth University's UNIV 200, Inquiry and the Craft of Argument.

Ted Nelson talk - Possiplex book launch From the Welcome to Possiplex: An Autobiography of Ted Nelson party at the Internet Archive on Oct 8, 2010.

Possiplex: Movies, Intellect, Creative Control, My Computer Life and the Fight for Civilization, an autobiography of Ted Nelson, Mindful Press, Feb 2011.

Triangulation 164 - Conversation with Ted Nelson Leo Laporte's July 2014 conversation with Ted Nelson, broadcast Aug 18, 2014 on TWiT.tv. On hypertext, Xanadu - and being a media guy. "To me, all media are alike. You think about what are the effects you want - and you think about what are the technicalities it will take to give you those effects. So when I took a computer course in graduate school, I thought 'Holy smoke, you can put interactive screens on them'... Interactive screens were instantly obvious to me."

Computer Lib / Dream Machines A brief description of Ted Nelson's 1974 book. Ordering information for an authorized 2014 replica reprint, which I highly recommend.

Ladies and gentlemen, the age of prestidigitative presentation and publishing is about to begin. Palpitating presentations, screen-scribbled, will dance to your desire, making manifest the many mysteries of winding wisdom. But if we are to rehumanize an increasingly brutal and disagreeable world, we must step up our efforts. And we must hurry. Hurry. Step right up.

Theodor H. Nelson, “Barnum-Tronics,”
Swarthmore College Alumni Bulletin, Dec 1970, 12-15
Quoted from Dream Machines, 1975
See New Media Reader Computer Lib / Dream Machines excerpt

Video Archive MIT / Brown Vannevar Bush Symposium: A Celebration of Vannevar Bush's 1945 Vision, An Examination of What Has Been Accomplished, and What Remains to Be Done. Oct 12-13 1995, MIT. Talks and panel discussion with Doug Engelbart, Ted Nelson, Andy van Dam, Tim Berners-Lee, Alan Kay and others. See also ACM Interactions summary (free access), transcript of day 1 and day 2 panels.

Thought Vectors - What Motivated Doug Engelbart

June 23, 2014 · · Posted by Greg Lloyd

By "augmenting human intellect" we mean increasing the capability of a man to approach a complex problem situation, to gain comprehension to suit his particular needs, and to derive solutions to problems. Increased capability in this respect is taken to mean a mixture of the following: more-rapid comprehension, better comprehension, the possibility of gaining a useful degree of comprehension in a situation that previously was too complex, speedier solutions, better solutions, and the possibility of finding solutions to problems that before seemed insoluble. And by "complex situations" we include the professional problems of diplomats, executives, social scientists, life scientists, physical scientists, attorneys, designers--whether the problem situation exists for twenty minutes or twenty years. We do not speak of isolated clever tricks that help in particular situations. We refer to a way of life in an integrated domain where hunches, cut-and-try, intangibles, and the human "feel for a situation" usefully co-exist with powerful concepts, streamlined terminology and notation, sophisticated methods, and high-powered electronic aids. 1a1

Man's population and gross product are increasing at a considerable rate, but the complexity of his problems grows still faster, and the urgency with which solutions must be found becomes steadily greater in response to the increased rate of activity and the increasingly global nature of that activity. Augmenting man's intellect, in the sense defined above, would warrant full pursuit by an enlightened society if there could be shown a reasonable approach and some plausible benefits. 1a2

Doug Engelbart Augmenting Human Intellect: A Conceptual Framework. SRI Summary Report AFOSR-3223, October 1962

This week's Thought Vectors in Concept Space assignment is a blog post based on a nugget from the works of Doug Engelbart. I like this quote because Doug talks clearly about what motivates his research; what motivated his life's work.

To me, it's interesting to note that Doug wrote his report in 1962 just as NASA was launching Project Apollo, and not long after President John F. Kennedy announced his challenge to land on the Moon. Project Apollo was arguably the most challenging engineering project of the 20th century, designing and testing families of new engineering systems as well as new classes of hardware. But Project Apollo was more than an engineering project; it was a grand challenge that motivated NASA to do its best and engaged most of the world as spectators in a high stakes, highly visible race to the Moon.

Doug's vision was also an engineering vision, designing and testing new human/computer systems as well as new classes of software. The paragraphs, links, paragraph-grain addresses, relationships, viewspecs and visualizations of Augment/NLS made Doug's thought vectors as real as they could possibly be, recording, linking and animating thoughts in a way that could never be done with paper plans and records. But like Project Apollo, Doug's vision was more than an engineering project; it was and is a grand challenge, to find better ways to enable people to solve critical problems, part of a trail on augmentation started by Vannevar Bush that will never end.

More

"Thoughtvectors in Concept Space badge" by @iamTalkyTina my posts | thoughtvectors.net

Related

Doug Engelbart | 85th Birthday Jan 30, 2010 - Blog post celebrating Doug Engelbart's 85th birthday, includes quotes and links to resources. One of the quotes from Engelbart's talk at the Brown/MIT Vannevar Bush Symposium became the tag line for this VCU course.

DougEngelbart.org: The Doug Engelbart Institute was conceived by Doug Engelbart to further his lifelong career goal of boosting our ability to better address complex, urgent problems. It contains an excellent history, an archive of papers, photos, and other published resources, as well as links to Doug's current projects.

Video Archive MIT / Brown Vannevar Bush Symposium: A Celebration of Vannevar Bush's 1945 Vision, An Examination of What Has Been Accomplished, and What Remains to Be Done. Oct 12-13 1995, MIT. Talks and panel discussion with Doug Engelbart, Ted Nelson, Andy van Dam, Tim Berners-Lee, Alan Kay and others. See also ACM Interactions summary (free access), transcript of day 1 and day 2 panels.

Augmenting Human Intellect: A Conceptual Framework. by Douglas C. Engelbart, October 1962 (SRI AUGMENT, 3906) A work Doug referred to as the bible of his research agenda, it also outlines the motive for his work: enabling groups of people to respond to the increasingly complex and urgent problems of humanity. If you want to read Doug's original works, start here.

Reinventing the Web II

June 16, 2014 · · Posted by Greg Lloyd

Updated 19 Jun 2016 Why isn't the Web a reliable and useful long term store for the links and content people independently create? What can we do to fix that? Who benefits from creating spaces with stable, permanently addressable content? Who pays? What incentives can make Web scale permanent, stable content with reliable bidirectional links and other goodies as common and useful as Web search over the entire flakey, decentralized and wildly successful Web? A Twitter conversation.

How the Web was Won

I believe Tim Berners-Lee's original HTTP and HTML protocols succeeded beyond his original vision of a globally scalable, loosely coupled network of Web pages that anyone could edit. The fact that his original protocols were simple, decentralized, and free for anyone to use was essential to their success in a world of competing proprietary Internet publishing and commerce "standards" from Microsoft and others. But in my opinion, the Web won by turning permanence and stability into a decentralized economic decision.

Berners-Lee's original W3C protocols appeared at the right time to open clear field opportunities for distributed publishing, marketing, sales and advertising that fueled the Web's growth and evolution. Recapping the argument from my first Reinventing the Web post:

The idea that any sensible person would rely on a global hypertext system where links on one computer pointed at locations on another computer which would break whenever the remote computer was unilaterally moved, renamed, taken off line or abandoned seemed absurd.

The idea that you would have no way to know what incoming links would break when editing or refactoring content seemed just as bad.

The World Wide Web protocols looked like they would work for relatively small cooperative groups like CERN, who could keep things from breaking by having shared goals, and by using peer pressure plus out-of-band communication to keep distributed content alive.

Actually that intuition was pretty good, because the World Wide Web took off in a direction based on other incentives compatible with those assumptions - and grew like crazy because, unlike the alternatives, it was simple, massively scalable, cheap, and eliminated the need for centralized control.

1) The Web became a distributed publishing medium, not the fabric for distributed editing and collaboration that Tim Berners-Lee and others envisioned. People and Web publishing engines like Amazon created content and kept it online while it had economic value, historical value (funded by organizations), or personal value. Content hosting became cheap enough for individuals or tiny groups. Advertising supported content became "free".

2) Search engines spanned the simple Web. Keeping content addressable now gained value since incoming links not only allowed people to bookmark and search engines to index what you had to publish (or sell), but the incoming links gained economic value through page rank. This provided even greater motivation to edit without breaking links, and to keep content online while it retained some economic, organizational or personal value.

3) People and organizations learned how to converse and collaborate over the Web by making it easy to create addressable content others could link to. The simple blog model lets people just add content and have it automatically organized by time. The Wiki model requires more thought and work to name, organize and garden content, but also creates stable, addressable islands of pages based on principles that reward cooperative behavior.

4) Search engines, syndication and notification engines built over the Web's simple, scalable protocols connected the Web in ways that I don't think anyone really anticipated - and work as independent and competing distributed systems, making rapid innovation possible.

Tim Berners-Lee made an inspired set of tradeoffs. Almost every concept of value on the Web: search engines, browsers, notification is built over his simple, open, highly scalable architecture.

I believe it's possible to provide what TBL calls "reasonable boundaries" for sharing sensitive personal or organizational data without breaking the basic W3C addressable content protocols that make linking and Web-scale search valuable. That should be the goal for social and business software, not siloed gardens with Web-proof walls.

Building a better Web over the Web we have

Telephone companies used to call their simplest and cheapest legacy service POTS (Plain Old Telephone Service). I believe it's possible to build a richer and more stable Web over POWS (Plain Old Web Services) without necessarily starting from scratch.

One answer to "who benefits?" and "who pays?" is the businesses that benefit from a richer and more stable Web connecting the systems they use to get work done. Stable fine-grain links and bi-directional relationships connecting systems of record and systems of engagement open the door to business systems that are more flexible, effective, simple to develop, and pleasant to use - more like the public Web than traditional line-of-business systems.

Museums, libraries, and archives such as Brewster Kahle's Internet Archive, the Library of Congress and others have a mission to collect and curate our cultural heritage and knowledge. The Internet Archive shows how little it costs to collect and index an archive of the content of the visible Web.

Commercial publishers monetize their archives, but have weaker economic incentives to maintain stable links to content outside their own domains.

Commerce sites and providers of consumer-focused Web services may have the greatest economic incentive for deep linking with stable references and relationships spanning devices you own, your home, your health and healthcare providers, your car, your family - and your work, see Continuity and Intertwingled Work.

If I'm right, there are economic incentives for Web content creators to make their work more linkable, visible and usable using straightforward, decentralized, and non-proprietary upwards-compatible extensions of Plain Old Web Services.

I believe that indices spanning permalinked locations as well as incoming and outgoing permalink references to content in "stable islands in the storm-tossed sea" can be created and maintained in near real time at Web scale, preserving the integrity of links to archival content distributed across the Web.

For example, any domain could publish an index to its permalinked content. Other domains implementing the same protocol could make incoming references to that content by permalink. This is a simple decentralized protocol, no more magical than the published external references that a link editor or dynamic linking system uses to resolve references connecting independently compiled modules of code.

Domains that agree to implement the same protocol, and use permalink (URI) references for content in other compatible domains then have a more stable, decentralized model for permanent links. If domains also publish their own permalink outgoing references (external as well as internal), a Web level service could build and maintain reliable inverted indices of bi-directional internal and domain spanning links. The federation of such domains could be spidered by any number of independently developed services, creating a more stable and useful Web as a decentralized service without breaking the simple Web protocols that every browser and other Web service relies on.
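The straw-man protocol above can be sketched in a few lines. The domain names, publication format, and function names here are hypothetical; the point is only that inverting each domain's published outgoing-permalink index yields a cross-domain incoming-link index with no centralized control.

```python
# Sketch of the straw-man federation: each cooperating domain publishes
# its permalinked pages and their outgoing permalink references; any
# independent service can invert those indices into an incoming-link index.
# Domains, URLs, and the publication format are hypothetical examples.

from collections import defaultdict

# What each domain might publish: permalink -> outgoing permalink references.
published = {
    "a.example": {
        "https://a.example/p/1": ["https://b.example/p/9"],
    },
    "b.example": {
        "https://b.example/p/9": ["https://a.example/p/1",
                                  "https://c.example/p/2"],
    },
}

def invert(published_indices):
    """Build the bi-directional (incoming-link) index across all domains."""
    incoming = defaultdict(set)
    for domain_index in published_indices.values():
        for source, targets in domain_index.items():
            for target in targets:
                incoming[target].add(source)
    return incoming

incoming = invert(published)
# incoming["https://a.example/p/1"] == {"https://b.example/p/9"}
```

Like a link editor resolving published external references, the service needs nothing beyond what each domain chooses to publish, so any number of competing services could spider the federation independently.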

I don't know who has suggested this before; it seems obvious, and it is a straw man, not a solution. I'm using it to argue that we can and should invent ways to improve the capabilities of the Web using the same simple, decentralized philosophy that made the Web wildly successful versus "better" hypertext systems.

See Michael Peter Edson's Dark Matter essay and my Thought Vectors - Vannevar Bush and Dark Matter response.

Related

Update 19 Jun 2016 See the Internet Archive Decentralized Web Summit, 8-9 June 2016 Locking the Web Open. See videos of the Summit and Brewster Kahle's notes: "Building a web that is decentralized - where many websites are delivered through a peer-to-peer network - would lead to the web being hosted from many places, leading to more reliable access, availability of past versions, access from more places around the world, and higher performance. It can also lead to more reader-privacy because it is harder to watch or control what one reads. Integrating a payments system into a decentralized web can help people make money by publishing on the web without the need for 3rd parties. This meeting focused on the values, technical, policy, deployment issues of reinventing basic infrastructure like the web."

Reinventing the Web (2009) Ted Nelson, Tim Berners-Lee and the evolution of the Web. Ted Nelson wants two-way links, stable transclusion, micropayments. Tim Berners-Lee wants a new Web with open, linked data. I believe that most of what they want can be delivered using the current flakey, decentralized and wildly successful Web as the delivery medium for richer, more stable, more permanent internal models, as stable federations of islands in a storm-tossed sea.

The Internet's Original Sin by Ethan Zuckerman, The Atlantic, Aug 14, 2014. Ethan confesses his role - invention of the pop-up Ad - stating "It’s obvious now that what we did was a fiasco, so let me remind you that what we wanted to do was something brave and noble." He makes a convincing case that the apple in the Web's garden is Investor storytime "... when someone pays you to tell them how rich they’ll get when you finally put ads on your site." A darkly comic but heartfelt essay on the past and future economy of the Web: "It's not too late to ditch the ad-based business model and build a better web"

Intertwingled Work (2010) No one Web service or collection of Web servers contains everything people need, but we get along using search and creative services that link content across wildly different sources. The same principle applies when you want to link and work across wildly diverse siloed systems of record and transactional databases.

Dark Matter: The dark matter of the Internet is open, social, peer-to-peer and read/write—and it’s the future of museums by Michael Peter Edson on May 19, 2014.

Continuity and Intertwingled Work (2014) A level above an Internet of Things: seamless experience across devices for you, your family, your health and trusted service providers, at home and at work.

Reinventing the Web III (2014) followup Twitter conversation with @zeynep, @jeffsonstein, @kevinmarks, and @roundtrip.

The Web of Alexandria (2015) by Bret Victor "We, as a species, are currently putting together a universal repository of knowledge and ideas, unprecedented in scope and scale. Which information-handling technology should we model it on? The one that's worked for 4 billion years and is responsible for our existence? Or the one that's led to the greatest intellectual tragedies in history?"

And Victor's followup post "Whenever the ephemerality of the web is mentioned, two opposing responses tend to surface. Some people see the web as a conversational medium, and consider ephemerality to be a virtue. And some people see the web as a publication medium, and want to build a "permanent web" where nothing can ever disappear. Neither position is mine. If anything, I see the web as a bad medium, at least partly because it invites exactly that conflict, with disastrous effects on both sides."

Update 13 Jul 2014 Added new section headings, added the inline recap and economic benefit examples, added a link to a Jul 2014 Reinventing the Web III Twitter conversation on the same topic.

Update 23 Aug 2014 Added link and brief note on Ethan Zuckerman's fine essay on advertising as the Internet's Original Sin.

Update 29 May 2015 Added links to Web of Alexandria and followup by Bret Victor on why the Web is a bad medium.

Update 19 Jun 2016 Added link to Brewster Kahle's summary of the Internet Archive's Decentralized Web Summit of 8-9 June 2016.

Thought Vectors - Vannevar Bush and Dark Matter

June 13, 2014 · · Posted by Greg Lloyd

On Jun 9 2014 Virginia Commonwealth University launched a new course, UNIV 200: Inquiry and the Craft of Argument, with the tagline Thought Vectors in Concept Space. The eight-week course includes readings from Vannevar Bush, J.C.R. Licklider, Doug Engelbart, Ted Nelson, Alan Kay, and Adele Goldberg. Assignments include blog posts and an invitation to participate on Twitter using the #thoughtvectors hashtag. The course has six sections taught at VCU, and an open section for the rest of the internet, which happily includes me! This week's assignment is a blog post based on a nugget that participants select from Vannevar Bush's 1945 essay As We May Think. Here's mine:

Wholly new forms of encyclopedias will appear, ready made with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified. The lawyer has at his touch the associated opinions and decisions of his whole experience, and of the experience of friends and authorities. The patent attorney has on call the millions of issued patents, with familiar trails to every point of his client’s interest. The physician, puzzled by a patient’s reactions, strikes the trail established in studying an earlier similar case, and runs rapidly through analogous case histories, with side references to the classics for the pertinent anatomy and histology. The chemist, struggling with the synthesis of an organic compound, has all the chemical literature before him in his laboratory, with trails following the analogies of compounds, and side trails to their physical and chemical behavior.

The historian, with a vast chronological account of a people, parallels it with a skip trail which stops only on the salient items, and can follow at any time contemporary trails which lead him all over civilization at a particular epoch. There is a new profession of trail blazers, those who find delight in the task of establishing useful trails through the enormous mass of the common record. The inheritance from the master becomes, not only his additions to the world's record, but for his disciples the entire scaffolding by which they were erected.

This quote is part of a longer section of Bush's essay describing his concept of the Memex, a desktop machine imagined as an extension of 1940s microfilm and vacuum tube technology.

This quote struck me while reading Michael Peter Edson's essay Dark Matter, published on Medium.com in May 2014.

Edson's essay begins "The dark matter of the Internet is open, social, peer-to-peer and read/write—and it’s the future of museums" explaining:

I am talking about museums, libraries, and archives—heritage, culture, knowledge, and memory institutions—and there is really nothing like them on the face of the earth. And whether we’ve realized it or not, my colleagues and I who work with technology in these institutions have been participating in an extraordinary project — the building of a planetary scale knowledge sharing network for the benefit of everyone on earth.

He writes:

Despite the best efforts of some of our most visionary and talented colleagues, we’ve been building, investing, and focusing on only a small part of what the Internet can do to help us accomplish our missions.

90% of the universe is made of dark matter—hard to see, but so forceful that it seems to move every star, planet, and galaxy in the cosmos.

And 90% of the Internet is made up of dark matter too—hard for institutions to see, but so forceful that it seems to move humanity itself.

And it’s not necessarily that the glass of museum, library, and archive technology projects is half empty, as opposed to half full; it’s the fact that the glass of the Internet and the dark matter of open, social, read/write cultural engagement is so much bigger than museums, libraries, and archives are accustomed to seeing and thinking about. And the glass keeps growing at exponential speed, whether we fill it with good work or wait in committee meetings for the water to pour itself…

Edson concludes that museums, libraries, and archives "can play a huge role in the story of how Earth’s 7 billion citizens will lead their lives, make and participate in their culture, learn, share, invent, create, cry, laugh, and do in the future" by going back to Tim Berners-Lee's original vision of the Web, where every person can be a writer as well as a reader.

Cultural Web sites, blogs, Google, Facebook, and Twitter are part of the solution, but Edson's challenge goes beyond that.

I believe there are three parts to his challenge:

The role of trail blazer: Just as Bush suggested in July 1945, I believe there's a need for people to act as explorers, guides, and trail blazers over knowledge they know and love. You can experience that personal knowledge and passion on a tour, at a talk, or in a conversation on a bus, at a party - anywhere you meet someone who loves one of these institutions. I think it's particularly valuable to have trail blazers who are also skilled professionals personally represent and communicate the values, knowledge, and heritage of their museum, just as a great reference librarian becomes a library's ambassador.

The medium: Museums have long had lectures, journals, and newsletters. Most cultural institutions now have web sites, blogs, and Twitter or Facebook accounts, which can be really interesting depending on who does the writing and response. In Dark Matter Edson goes well beyond the comfort zone of most museums into the world of video blogging, Reddit, Pinterest, Tumblr and more. Of the video blogging brothers who created 1,000-plus videos on the YouTube Vlogbrothers channel, Edson writes:

It is evident from watching 30 seconds of any of their videos that they are nerds, and they proudly describe themselves as such. If you announced to your museum director or boss that you intended to hire Hank and John Green to make a series of charming and nerdy videos about literature, art, global warming, politics, travel, music, or any of the other things that Hank and John make videos about you would be thrown out of whatever office you were sitting in and probably be asked to find another job.

The mission: A little less than a year before the end of World War II, President Franklin D. Roosevelt wrote a letter to Vannevar Bush, asking how to turn the "unique experiment of team-work and cooperation in coordinating scientific research and in applying existing scientific knowledge" during WWII to the peaceful pursuit of scientific knowledge after the end of the war. President Roosevelt concluded: "New frontiers of the mind are before us, and if they are pioneered with the same vision, boldness, and drive with which we have waged this war we can create a fuller and more fruitful employment and a fuller and more fruitful life." Bush delivered his response to then-President Harry S. Truman in July 1945, the same month As We May Think was published in The Atlantic Monthly. Bush's report, titled Science, the Endless Frontier, led to the creation of the National Science Foundation.

The Dark Matter mission is different, but it calls on museums and other cultural institutions to rethink how they bring together the heritage they preserve and the broader society they serve. I believe that the skills and passion of trail blazers can help connect the people and the common record of their culture by creating trails that can be seen and built upon now and by future generations. Anyone can now create a trail, and museums should become the richest and most welcoming sources for trail creation. Museums can help by opening up access as well as by creating and curating trails - across all media - as part of their core mission, a unique experiment in team-work and cooperation.

See Dark Matter and Trailblazers - @mpedson and Vannevar Bush for more quotes from Michael Peter Edson's essay, quotes from As We May Think, and President Roosevelt's wartime letter to Vannevar Bush.

Update Oct 30, 2014 See Michael Peter Edson's Internet Librarian International 14 keynote slides, Dark Matter 

Update Jan 21, 2015: See The Museum of the Future Is Here by Robinson Meyer, The Atlantic, Jan 20, 2015. A thoughtful redesign of the Smithsonian's Cooper Hewitt museum adds a stable URL for every object in its collection, as well as an API for accessing related content.

What the API means, for someone who will never visit the museum, is that every object, every designer, every nation, every era, even every color has a stable URL on the Internet. No other museum does this with the same seriousness as the Cooper Hewitt. If you want to talk about Van Gogh’s Starry Night online, you have to link to the Wikipedia page. Wikipedia is the best permanent identifier of Starry Night-ness on the web. But if you want to talk about an Eames Chair, you can link to the Cooper Hewitt’s page for it ...

“When we re-open, the building will be the single largest consumer of the API,” said Chan.

In other words, the museum made a piece of infrastructure for the public. But the museum will benefit in the long term, because the infrastructure will permit them to plan for the near future.

And the museum will also be, of course, the single largest beneficiary of outsider improvements to the API. It already talks to other APIs on the web. Ray Eames’s page, for instance, encourages users to tag their Instagrams and Flickr photos with a certain code. When they do, Cooper Hewitt’s API will automatically sniff it out and link that image back to its own person file for Eames. Thus, the Cooper Hewitt’s online presence grows even richer.
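Edson's tag-sniffing idea described above can be sketched in a few lines. This is a hedged illustration only: the tag format, the `collection.example.org` base URL, and the function names are my inventions for the sketch, not Cooper Hewitt's actual API. The idea is simply that a museum-assigned code in a photo caption is enough to link the photo back to a stable object or person URL.

```python
import re

# Hypothetical tag code a museum might ask visitors to put in photo captions,
# e.g. "museumtag:person=18041502". The format is invented for this sketch.
TAG_PATTERN = re.compile(r"museumtag:(person|object)=(\d+)")

def link_tagged_photos(photos, base_url="https://collection.example.org"):
    """Return (photo_id, stable_url) pairs for captions carrying a tag code."""
    links = []
    for photo in photos:
        match = TAG_PATTERN.search(photo["caption"])
        if match:
            kind, ident = match.groups()
            # Map the tag code onto a stable collection URL (invented scheme).
            links.append((photo["id"], f"{base_url}/{kind}s/{ident}/"))
    return links

photos = [
    {"id": "p1", "caption": "Eames chair! museumtag:person=18041502"},
    {"id": "p2", "caption": "lobby selfie, no tag"},
]
print(link_tagged_photos(photos))
```

In a real deployment the scan would run over photos fetched from the Instagram or Flickr APIs, but the linking step itself is this simple: a stable URL scheme plus a recognizable code.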

More

"Thoughtvectors in Concept Space badge" by @iamTalkyTina my posts | thoughtvectors.net

Related

As We May Think - Vannevar Bush, Atlantic Monthly, July 1, 1945

Reinventing the Web - Blog post on the creation and evolution of the Web and thoughts on making the Web a more writerly medium based on Berners-Lee's original intent and the vision of Ted Nelson.

Doug Engelbart's copy of As We May Think - with Doug's 1962 notes scribbled in the margins - Blog post also includes links to the Oct 1995 Brown/MIT Vannevar Bush Symposium on the 50th anniversary of As We May Think, with videos of talks and panel sessions.

Doug Engelbart | 85th Birthday Jan 30, 2010 - Blog post celebrating Doug Engelbart's 85th birthday, includes quotes and links to resources. One of the quotes from Engelbart's talk at the Brown/MIT Vannevar Bush Symposium became the tag line for this VCU course:

Doug Engelbart: ... So, moving your way around those thought vectors in concept space - I'd forgotten about that

Alan Kay: You said that, right?

Doug Engelbart: I must have, it's so good. [laughter] It's to externalize your thoughts in the concept structures that are meaningful outside and moving around flexibly and manipulating them and viewing them. It's a new way to operate on a new kind of externalized medium. So, to keep doing it in a model of the old media is just a hangup that someplace we're going to break that perspective and shift and then the idea of high performance and the idea of high performance teams who've learned to coordinate, to get that ball down the field together in all kinds of operations.

Remembering Doug Engelbart, 30 January 1925 - 2 July 2013

July 4, 2013 · · Posted by Greg Lloyd

I was very sad to learn that Doug Engelbart passed away at his home on 2 July 2013. Doug had a long life as a visionary engineer, inventor, and pioneer of technology we use every day - and technology where we're just starting to catch up to Doug and his SRI team in 1968. Doug had a quiet, friendly, and unassuming nature combined with deep knowledge, iron will, and a determination to pursue his vision. His vision was to aid humanity in solving complex, difficult and supremely important problems; Doug's goals were noble and selfless. The sense of dealing with an Old Testament prophet - a kindly Moses - is perhaps the greatest loss I and countless others who have met and been inspired by Doug feel today. I've written frequently about Doug in the past, and I'll continue to do so. Here are a few remembrances and resources that seem appropriate. I'll update this list over the next several days. Farewell Doug and my sincere condolences to his family and many friends.

“Someone once called me ‘just a dreamer’. That offended me, the ‘just’ part; being a real dreamer is hard work. It really gets hard when you start believing in your dreams.” — Doug Engelbart, Dreaming of the Future, Byte, September 1995.

Press and public valediction

DOUGLAS C. ENGELBART, 1925-2013 Computer Visionary Who Invented the Mouse John Markoff, New York Times, 3 July 2013. "It was his great insight that progress in science and engineering could be greatly accelerated if researchers, working in small groups, shared computing power. He called the approach “bootstrapping” and believed it would raise what he called their “collective I.Q.”"

In Memoriam: Douglas Engelbart, Maestro of the Mouse and So Much More Harry McCracken, Time, 3 July 2013. "Engelbart was able to see things that most people couldn’t, and make them real. But he was also a passionate believer in what he called Collective IQ — the ability of teams to do things that lone guns cannot."

Computing pioneer and GUI inventor Doug Engelbart dies at 88 Dylan Tweney, VentureBeat.com, 3 July 2013. "Although Engelbart is often referred to as the inventor of the mouse, that’s a bit like saying Henry Ford was the inventor of the steering wheel. The mouse was a clever invention, but it was merely one component of a larger vision of how computers could increase human intelligence, or what Engelbart called our collective IQ."

Doug Engelbart, visionary Robert X. Cringely, I, Cringely, 3 July 2013. "To most people who recognize his name Doug Engelbart was the inventor of the computer mouse but he was much, much more than that. In addition to the mouse and the accompanying chord keyboard, Doug invented computer time sharing, network computing, graphical computing, the graphical user interface and (with apologies to Ted Nelson) hypertext links. And he invented all these things — if by inventing we mean envisioning how they would work and work together to create the computing environments we know today — while driving to work one day in 1950."

Chris Nuzum's fine valediction for Doug: "RIP Doug Engelbart, and thank you. For taking the time to walk a few miles after dinner in 1995 with a young admirer, for your urgent encouragement to do something about my ideas, for your generosity with your time in providing feedback and encouragement, and for the lifetime of work you poured yourself into with boundless enthusiasm and determination. Your inspiration lives on." See photo

Douglas Engelbart's Unfinished Revolution Howard Rheingold, MIT Technology Review 23 July 2013. "To Engelbart, computers, interfaces, and networks were means to a more important end—amplifying human intelligence to help us survive in the world we’ve created. He listed the end results of boosting what he called “collective IQ” in a 1962 paper, Augmenting Human Intellect. They included “more-rapid comprehension … better solutions, and the possibility of finding solutions to problems that before seemed insoluble.” If you want to understand where today’s information technologies came from, and where they might go, the paper still makes good reading."

Engelbart's First, Second and Third Order Problems. Jonathan Stray's 4 July 2013 Twitter valediction, a Storify collection with some links expanded. "First order is doing. Second is improving the doing. Third is improving the improving."

If you truly want to understand NLS, you have to forget today. Bret Victor wrote A few words on Doug Engelbart, 3 July 2013, in honor of Doug Engelbart's life and passing. A few very well chosen words. A Storify collection with a few links expanded and quoted.

"The least important question you can ask about Engelbart is, "What did he build?" By asking that question, you put yourself in a position to admire him, to stand in awe of his achievements, to worship him as a hero.

But worship isn't useful to anyone. Not you, not him. The most important question you can ask about Engelbart is, "What world was he trying to create?" By asking that question, you put yourself in a position to create that world yourself."

Doug Engelbart Resources

DougEngelbart.org: The Doug Engelbart Institute was conceived by Doug Engelbart to further his lifelong career goal of boosting our ability to better address complex, urgent problems. It contains an excellent history, an archive of papers, photos and other published resources, as well as links to Doug's current projects.

Douglas Engelbart Interviewed by John Markoff of the New York Times Outracing the Fire: 50 Years and Counting of Technology and Change Computer History Museum oral history interview, March 26, 2002.

Doug Engelbart Video Archive: 1968 Demo - FJCC Conference Presentation Reel Dec 9, 1968 Internet Archive, the so-called Mother of All Demos. See also From Pranksters to PCs, a chapter about Engelbart's 1968 FJCC demo from John Markoff's book What the Dormouse Said: How the 60s Counterculture Shaped the Personal Computer Industry, authorized excerpt.

Video Archive MIT / Brown Vannevar Bush Symposium: A Celebration of Vannevar Bush's 1945 Vision, An Examination of What Has Been Accomplished, and What Remains to Be Done. Oct 12-13 1995, MIT. Talks and panel discussion with Doug Engelbart, Ted Nelson, Andy van Dam, Tim Berners-Lee, Alan Kay and others. See also ACM Interactions summary (free access), transcript of day 1 and day 2 panels.

Augmenting Human Intellect: A Conceptual Framework. by Douglas C. Engelbart, October 1962 (SRI AUGMENT, 3906) A work Doug referred to as the bible of his research agenda, it also outlines the motive for his work: enabling groups of people to respond to the increasingly complex and urgent problems of humanity. If you want to read Doug's original works, start here:

By "augmenting human intellect" we mean increasing the capability of a man to approach a complex problem situation, to gain comprehension to suit his particular needs, and to derive solutions to problems. Increased capability in this respect is taken to mean a mixture of the following: more-rapid comprehension, better comprehension, the possibility of gaining a useful degree of comprehension in a situation that previously was too complex, speedier solutions, better solutions, and the possibility of finding solutions to problems that before seemed insoluble. And by "complex situations" we include the professional problems of diplomats, executives, social scientists, life scientists, physical scientists, attorneys, designers--whether the problem situation exists for twenty minutes or twenty years. We do not speak of isolated clever tricks that help in particular situations. We refer to a way of life in an integrated domain where hunches, cut-and-try, intangibles, and the human "feel for a situation" usefully co-exist with powerful concepts, streamlined terminology and notation, sophisticated methods, and high-powered electronic aids. 1a1

Man's population and gross product are increasing at a considerable rate, but the complexity of his problems grows still faster, and the urgency with which solutions must be found becomes steadily greater in response to the increased rate of activity and the increasingly global nature of that activity. Augmenting man's intellect, in the sense defined above, would warrant full pursuit by an enlightened society if there could be shown a reasonable approach and some plausible benefits. 1a2

Traction Software Blog posts

Tricycles vs. Training Wheels Jon Udell writes: "Easy-to-use computer systems, as we conventionally understand them, are not what Engelbart had in mind. You might be surprised to learn that he regards today’s one-size-fits-all GUI as a tragic outcome. That paradigm, he said in a talk at Accelerating Change 2004, has crippled our effort to augment human capability." Doug's discussion with Alan Kay at the 50th Anniversary of As We May Think (including links).

Traction Roots - Doug Engelbart Elements of Doug's work that directly inspired Traction TeamPage, what we do, and how we work. A personal remembrance.

Original Traction Product Proposal - Annotated references and appendices on the work of Doug Engelbart and Ted Nelson.

Flip Test 1971 | Email versus Journal Doug Engelbart's Journal versus email - an alternate history.

And here's what Enterprise 2.0 looked like in 1968 | Dealing lightning with both hands... The 1968 Mother of All Demos and John Markoff's What the Dormouse Said

Enterprise 2.0 Schism Doug Engelbart and Peter Drucker are the two patron saints of Enterprise 2.0. And why.

Doug Engelbart | 85th Birthday Jan 30, 2010 Doug Engelbart's mission, goals and accomplishments, including a dialog with Alan Kay at the 50th Anniversary of As We May Think symposium.

Doug Engelbart's copy of As We May Think - with Doug's 1962 notes scribbled in the margins From the Doug Engelbart digital archive (see links). Original donated to the Computer History Museum.

Happy Birthday Doug Engelbart! Video highlights from Doug's talk and panels at the 50th Anniversary of As We May Think symposium, Oct 1995. Videos of Doug's talks including his famous Dec 1968 Mother of All Demos are now part of the Doug Engelbart Digital Archive maintained and managed by The Internet Archive

Intertwingled Work

July 5, 2010 · · Posted by Greg Lloyd

Last week's post by Jim McGee Managing the visibility of knowledge work kicked off a nice conversation on Observable Work (using a term introduced by Jon Udell) including: my blog post expanding on a comment I wrote on Jim's post; Brian Tullis's Observable Work: The Taming of the Flow based on a comment Brian made on Jim's post, which he found from a Twitter update by @jmcgee retweeted by @roundtrip; a Twitter conversation using the hashtag #OWork (for "Observable Work"); John Tropea's comment back to Jim from a link in a comment I left on John's Ambient Awareness is the new normal post; Jim's Observable work - more on knowledge work visibility (#owork), linking back to Mary Abraham's TMI post and Jack Vinson's Invisible Work - spray paint needed post, both written in response to Jim's original post; followed by Jack Vinson's Explicit work (#owork) and Paula Thornton's Enterprise 2.0 Infrastructure for Synchronicity.

That's a bunch of links! But I include them for a reason. [ For anyone who finds the presence of inline links distracting, see Apology to the Easily Distracted, below. ]

This modest trail is not only observable - it's spread over about a dozen posts on eight unrelated blog servers using unrelated software, loosely coupled by conversations, links and hash tags observable in the Web commons known as Twitter. The only things that connect this trail are links, search, syndicated feeds and serendipity. In the words of Ted Nelson this is an intertwingled trail - although not very deeply intertwingled, and not that easy to follow.

That brings three points to mind:

1) The fact that "intertwingle" is an amusing word can obscure an important idea. I believe Ted Nelson is a Cassandra-like inventor blessed and cursed with a rapier wit and the ability to invent concepts and coin terms that stick deeply in people's minds. Hypertext is one of the terms Ted coined and concepts he invented - working independently from Doug Engelbart at about the same time - inspired by the work of Vannevar Bush.

One of Ted's mantras: "EVERYTHING IS DEEPLY INTERTWINGLED. In an important sense there are no "subjects" at all; there is only all knowledge, since the cross-connections among the myriad topics of this world simply cannot be divided up neatly." Ted Nelson, Computer Lib / Dream Machines, 1974

Although I think it's useful to believe in the existence of subjects, Nelson has a point. In the past, conversations could only be intertwingled across paper memos, faxes, written reports and email. Until the advent of the Web it wasn't possible to intertwingle conversations, networks, analysis and work in near-real time at global scale. Now that's trivial and essentially free with basic Web access.

2) The Web does what it's intended to do, so long as content is addressable and findable. The trail on observable work isn't stored in one specific place - but with a little effort it's possible to follow the flow and join the conversation.

The fact that blog posts and comments are created and served by different content server systems is irrelevant, so long as the content is addressable using basic Web standards. How the different servers store the addressed content internally is likewise irrelevant so long as they deliver the content using Web content standards.

The fact that you don't need a single common place to contain all trails is an advantage of the Web, not a disadvantage. It makes finding and linking harder, but creation and association infinitely easier than attempting to force the world into one "Observable Work" discussion area you create in one specific blog, wiki, forum, Wave or whatever.

The Web succeeds by making it possible for anyone anywhere to create a trail which others can find, follow and join using nothing more than their own Web browser, Web search layered over the basic Web, and a place like Twitter (one of many places where anyone can easily create visible, Web indexed trails).
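Mechanically, following such a trail needs nothing beyond fetching pages and extracting their outbound links. Here is a minimal sketch using only the Python standard library; the page content and URLs are invented for illustration, and a real crawl would fetch each page with urllib.request before extracting its links:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href targets from anchor tags in an HTML document."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def outbound_links(html):
    """Return the list of outbound link targets found in an HTML string."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

# Invented example page: a blog post linking onward to two unrelated servers.
page = ('<p>See <a href="https://blog-a.example/owork">Jim</a> and '
        '<a href="https://blog-b.example/flow">Brian</a>.</p>')
print(outbound_links(page))
# → ['https://blog-a.example/owork', 'https://blog-b.example/flow']
```

The point of the sketch is that no shared server or common software is required: each hop in the trail is just an addressable URL, which is exactly why blogs on eight unrelated servers can form one conversation.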

The Web doesn't guarantee that you'll be aware of conversations on observable work going on in other trails unless you search or stumble upon a link which leads you to connect the two. I don't follow discussions on LinkedIn, but might be alerted to something interesting there by someone I follow in a commons like Twitter where I do participate.

The fact that everything posted publicly on the Web is potentially observable doesn't mean you have to deal with Too Much Information shoved in your face - or into your email box.

You choose who and what to follow, augmented by Web search and your ability to jump in and join - or forget about - a trail at any time, although you might hold on to a link so it's easy to find the trail again if you change your mind.

3) Business context makes intertwingled work easier to create, discover and use. Unlike the public Web, work in a private or public organization has a purpose and context that can make intertwingled work easier to discover and talk about. Work and discussion in an organization generally take place in the context of broad business activities like sales, product development, research, finance or administration. Context in an enterprise can be represented as places where work and conversation take place, with reliable privacy-aware search, tagging, linking, comments, status updates and activity streams.

I believe the important point is supporting business context - not business process in the sense of transactional workflow or automated systems. I believe that functionally specialized transactional systems in an organization will likely remain silos of structured information - but market forces will drive vendors to make their content addressable using simple Web standards and services - with consistent authentication and visibility based on context dependent business rules.

These functionally specialized systems will also signal their status using social computing standards that are now starting to take shape. This will push routine reporting and dealing with exceptions from transactional systems into the "social" places where people can stay informed, recognize issues and exceptions and decide what to do. In an ideal world, transactional systems would provide authenticated access to Web addressable content or analysis, signals based on routine activity or exceptions, sensible Web control interfaces - and not much more. Most human access would be handled on the Web rather than the transactional processing side. I believe the Web has become a valid, scalable and secure alternative to proprietary stacks for integrating most enterprise software at the user experience level.

Much of what a sociologist would call "social" behavior when talking about Enterprise 2.0 would naturally center on the sociology of work: how people communicate and interact with others while dealing with questions, issues, exceptions, suggestions and the messy stuff that routine transactional systems can't handle, along with interpersonal relationships that develop in a specific context or as a member of an extended enterprise (including customers, suppliers, consultants and external as well as internal stakeholders).

On top of relationships based on well-established patterns of work and conversation - Andrew McAfee's strong ties - enterprise social software opens the door to discovering people and groups whom most folks in a large organization would never meet face to face.

This offers the same opportunities for serendipitous discovery we see on the public Web, but with privacy in context, which enables open discussion and shared goals and purpose that are part of what Peter Drucker calls the purpose of an organization: "The purpose of an organization is to enable ordinary human beings to do extraordinary things."

Much of what's challenging about using "observable work" principles can be addressed by examples at top, middle and grassroots levels of an organization. What's needed is a willingness to tolerate and encourage observable work in the small under local control, and leadership to make it an enterprise norm.

As Paula Thornton says: "For as much as people want to make Enterprise 2.0 about technologies, then I’m willing to concede this: Enterprise 2.0 is the means by which to achieve Work 2.0 to deliver Business 2.0."

To be continued Jim, Brian, John, Mary, Jack, Paula, Mark, Gordon, Rawn, Jose, JP, Tom, Deb and the rest of the World - over to you. The best way to follow the evolution of the Observable Work trail is Twitter's #OWork tag. All of the participants seem to use Twitter as a commons, linking posts that either directly respond to the Observable Work conversation, or are related in some interesting way, such as Tom Peters's Strategy: Space Matters ("who sits next to whom in your office can make a huge difference"), JP Rangaswami's Musing about learning by doing, Deb Lavoy's Common Operating Picture - share facts, debate possibilities, John Tropea's link to Keith Swanson's excellent slide set, and John's soon-to-be-published post on Adaptive Case Management.

Unfortunately neither Twitter nor Google's hashtag search seems complete and reliable. So far as I can tell, not all Tweets mentioning #OWork are found by either service. There's room for improvement on the public Web as well as in the Enterprise 2.0 domain.

Apology to the Easily Distracted: Readers who find embedded links distracting don't have to click while reading the paragraph. I apologize if using the Web to source references that would be unimaginably difficult to provide in any other medium is a distraction. I believe it's not hard to exercise a little discipline when reading, then go back and click any links where you'd like to dive deeper based on your interests. I like to put a small number of See Also links at the bottom of posts where you can dive deeper if you choose.

Related

The Work Graph Model: TeamPage style - "...A work graph consists of the units of work (tasks, ideas, clients, goals, agenda items); information about that work (relevant conversations, files, status, metadata); how it all fits together; and then the people involved with the work (who’s responsible for what? which people need to be kept in the loop?)"

Reinventing the Web - Ted Nelson, Tim Berners-Lee and the evolution of the Web. The Web rightly won versus "better" models by turning permanence into a decentralized economic decision.

Reinventing the Web II - Why isn't the Web a reliable and useful long term store for the links and content people independently create? What can we do to fix that? Who benefits from creating spaces with stable, permanently addressable content? Who pays? What incentives can make Web scale permanent, stable content with reliable bidirectional links and other goodies as common and useful as Web search over the entire flakey, decentralized and wildly successful Web?

The Future of Work Platforms: Like Jazz - The social dance of getting things done, dealing with exceptions, and staying aware of what’s going around you

Fixing Enterprise Search - Context and addressable content in functional line of business systems

Enterprise 2.0 - Letting hypertext out of its box - Hypertext and the Web

User Experience Standards for Social Computing in the Enterprise Notes for Mike Gotta E2.0 Boston 2010 panel

Enterprise 2.0 and Observable Work - A riff on themes from Jim McGee and Jon Udell

29 July 2010 | Enterprise 2.0 and Observable Work: Brian Tullis and Joe Crumpler, Burton Group Catalyst 2010 San Diego

Doug Engelbart | 85th Birthday Jan 30, 2010

January 30, 2010 · · Posted by Greg Lloyd

"Doug Engelbart sat under a twenty-two-foot-high video screen, "dealing lightning with both hands." At least that's the way it seemed to Chuck Thacker, a young Xerox PARC computer designer who was later shown a video of the demonstration that changed the course of the computer world." from What the Dormouse Said, John Markoff.

Doug Engelbart has been recognized as a great figure in the history of technology with awards including the National Medal of Technology presented by President Bill Clinton for: "... creating the foundations of personal computing including continuous real-time interaction based on cathode-ray tube displays and the mouse, hypertext linking, text editing, on-line journals, shared-screen teleconferencing, and remote collaborative work."

Doug is also a noble figure. Doug's research has a moral purpose, reflecting the skills and attitude of a great engineer and humanitarian: if the world is faced with complex, intractable problems that challenge the ability of individuals and nations to solve them, what can I do to help people fix what's broken?

Doug's research focuses on how computers can aid people's ability to think and work as groups as well as individuals - what Doug refers to as Augmentation rather than Automation. This involves understanding how problem solving groups actually behave - and how introducing new technology changes behavior and vice versa. This led me to nominate Doug along with Peter Drucker as a patron saint of Enterprise 2.0; the phrase is flip but the thought is serious. Please send 85th Birthday greetings to Doug today.

I'll let Doug speak for himself in the opening paragraphs of what he calls the bible of his research agenda, AUGMENTING HUMAN INTELLECT: A Conceptual Framework from Oct 1962:

By "augmenting human intellect" we mean increasing the capability of a man to approach a complex problem situation, to gain comprehension to suit his particular needs, and to derive solutions to problems. Increased capability in this respect is taken to mean a mixture of the following: more-rapid comprehension, better comprehension, the possibility of gaining a useful degree of comprehension in a situation that previously was too complex, speedier solutions, better solutions, and the possibility of finding solutions to problems that before seemed insoluble. And by "complex situations" we include the professional problems of diplomats, executives, social scientists, life scientists, physical scientists, attorneys, designers--whether the problem situation exists for twenty minutes or twenty years. We do not speak of isolated clever tricks that help in particular situations. We refer to a way of life in an integrated domain where hunches, cut-and-try, intangibles, and the human "feel for a situation" usefully co-exist with powerful concepts, streamlined terminology and notation, sophisticated methods, and high-powered electronic aids.1a1

Man's population and gross product are increasing at a considerable rate, but the complexity of his problems grows still faster, and the urgency with which solutions must be found becomes steadily greater in response to the increased rate of activity and the increasingly global nature of that activity. Augmenting man's intellect, in the sense defined above, would warrant full pursuit by an enlightened society if there could be shown a reasonable approach and some plausible benefits.1a2

This report covers the first phase of a program aimed at developing means to augment the human intellect. These "means" can include many things--all of which appear to be but extensions of means developed and used in the past to help man apply his native sensory, mental, and motor capabilities--and we consider the whole system of a human and his augmentation means as a proper field of search for practical possibilities. It is a very important system to our society, and like most systems its performance can best be improved by considering the whole as a set of interacting components rather than by considering the components in isolation. - AUGMENTING HUMAN INTELLECT: A Conceptual Framework, Douglas Engelbart October 1962

And notes from a conversation with Alan Kay - one of the two thousand people who attended Doug's Dec 1968 Demo, and went on to shape the world of technology as we know it.

Alan Kay At PARC one of the goals was to do NLS as a distributed system and all of the ALTOs had the five-finger keyboards as well as the mouse on them. We basically loved NLS and we'd done a few modifications which we thought even sped up. NLS part of the interaction scheme on it was, I believe, because the analog mouse there was some drift in it, so one of the things that they did was to say what kind of a thing you were pointing at, so you'd say move character or move word or move paragraph and so forth. It was kind of a procedure where you gave the command first and then bug bug and then command accept. We realized at Xerox PARC that you wanted to have a speedy scheme for interacting and we thought we could go even one better by selecting the objects, so you'd select something you'd do something to, give the command and then, in the case of move character you'd go select, move, select and it would do it with fewer keystrokes.

Now, the abortion that happened after PARC was the misunderstanding of the user interface that we did for children, which was the overlapping window interface which we made as naive as absolutely we possibly could to the point of not having any work flow ideas in it and that was taken over uncritically out into the outside world. So we have many systems, like Lotus Notes and many mail systems that when you say reply it comes up with a window over the very thing you were reading as though there weren't any connection between these things. So this is an abortion to me, but its basically part of the whole feel. Whereas our notion was that you start the kids off on this fairly simple, naive thing and then there would be an actual progression where you would get up to this several commands a second kind of thing that you could do with NLS. If you have ever seen anybody use NLS it is really marvelous cause you're kindof flying along through the stuff several commands a second and there's a complete different sense of what it means to interact than you have today. I characterize what we have today as a wonderful bike with training wheels on that nobody knows they are on so nobody is trying to take them off. I just feel like we're way way behind where we could have been if it weren't for the way commercialization turned out.

Doug Engelbart Well, strangely enough, I feel the same. It's part of the thing of the easy to learn and natural to use thing that became sort of a god to follow and the marketplace is driving it and its successful and you could market on that basis, but some of the diagrams pictures that I didn't quite get to the other day was how do you ever migrate from a tricycle to a bicycle because a bicycle is very unnatural and very hard to learn compared to a tricycle, and yet in society it has superseded all the tricycles for people over five years old. So the whole idea of high-performance knowledge work is yet to come up and be in the domain. Its still the orientation of automating what you used to do instead of moving to a whole new domain in which you are going to obviously going to learn quite a few new skills. And so you make analogies of suppose you wanted to move up to the ski slopes and be mobile on skis. Well, just visiting them for an afternoon is not going to do it. So, I'd love to have photographs of skateboards and skis and windsurfing and all of that to show you what people can really do if they have a new way supplied by technology to be mobile in a different environment. None of that could be done if people insisted that it was an easy-to-learn thing.

So, moving your way around those thought vectors in concept space - I'd forgotten about that

Alan Kay You said that, right?

Doug Engelbart I must have, its so good. [laughter] Its to externalize your thoughts in the concept structures that are meaningful outside and moving around flexibly and manipulating them and viewing them. Its a new way to operate on a new kind of externalized medium. So, to keep doing it in a model of the old media is just a hangup that someplace we're going to break that perspective and shift and then the idea of high performance and the idea of high performance teams who've learned to coordinate, to get that ball down the field together in all kinds of operations. I feel like the real breakthrough for us getting someplace is going to be when we say 'All right, lets put together high-performance, knowledge-work teams and lets pick the roles they're going to play within our organizations in some way in such even though they operate very differently from their peers out in the rest of the organization they can interact with them and support them very effectively. So there are roles like that that would be very effective and everyone else can sortof see because they're interacting with these guys what they can do. And suppose it does take 200 hours of specialized training - that's less than boot camp.

One of those boxes on that paradigm map about deployment was really coming down and showing you that special purpose teams are one kind of thing in the way that they can propagate and very different from moving a group of people who have an existing set of staff and processes and methods and skills and equipment and trying to move them all together. It's practically an impossible task to do that in any significantly large step without having casualties. They just aren't all equipped to mobile in that space. So, there's a lot to go with that and it all stems from looking at today and saying 'why do we accept that?' That's the modern thing, its almost a religion. In any other company I'd be afraid to bring that out. Maybe I'll have to run from you too... from Notes from the Panels The Brown / MIT Vannevar Bush Symposium, October 1995

Update Remembering Doug Engelbart, 30 January 1925 - 2 July 2013

Doug Engelbart Resources

DougEngelbart.org: The Doug Engelbart Institute was conceived by Doug Engelbart to further his lifelong career goal of boosting our ability to better address complex, urgent problems. It contains an excellent history, archive of papers, photos and other published resources as well as links to Doug's current projects.

Douglas Engelbart Interviewed by John Markoff of the New York Times Outracing the Fire: 50 Years and Counting of Technology and Change Computer History Museum oral history interview, March 26, 2002.

Doug Engelbart Video Archive: 1968 Demo - FJCC Conference Presentation Reel Dec 9, 1968 Internet Archive, the so-called Mother of All Demos. See also From Pranksters to PCs, the chapter about Engelbart's 1968 FJCC demo from John Markoff's book What the Dormouse Said: How the 60s Counterculture Shaped the Personal Computer Industry, authorized excerpt.

Video Archive MIT / Brown Vannevar Bush Symposium: A Celebration of Vannevar Bush's 1945 Vision, An Examination of What Has Been Accomplished, and What Remains to Be Done. Oct 12-13 1995, MIT. Talks and panel discussion with Doug Engelbart, Ted Nelson, Andy van Dam, Tim Berners-Lee, Alan Kay and others. See also ACM Interactions summary (free access), transcript of day 1 and day 2 panels.

Doug Engelbart's copy of Vannevar Bush's 1945 As We May Think, with Doug's 1962 notes scribbled in the margins.

Tuesday Dec 9, 2008 | Forty years after the Mother of All Demos Engelbart's demonstration of the Augment shared screen hypertext and video system developed by a team at SRI under Doug's leadership. Links to videos, interviews and other resources

AUGMENTING HUMAN INTELLECT: A Conceptual Framework By Douglas C. Engelbart October 1962 (SRI AUGMENT, 3906)

And yes, Doug also invented the mouse, and used it in his 1968 demo. But introducing Doug as the inventor of the mouse is like introducing Leonardo da Vinci as a guy who knew how to make good paint brushes.

Related

Enterprise 2.0 Schism

November 9, 2009 · · Posted by Greg Lloyd

I have to confess that I've enjoyed watching recent rounds of Enterprise 2.0 discussion and mud wrestling. The fact that so many people enjoy debating definitions, values, doctrinal principles - even the existence of Enterprise 2.0 - makes me think that E2.0 might best be framed as a religious debate. With that in mind, I'd like to introduce a new and exciting element: schism.

I hereby declare myself an Enterprise 2.0 Strict Druckerian. I believe that "2.0" should be considered a modifier of Enterprise rather than an allusion to mere Web 2.0 technology - which is what an Enterprise 2.0 Strict Solutionist would have you believe.

I further declare: No, it is not "all about the people" - which is what an Enterprise 2.0 Strict Proletarian would have you believe. Without the enabling technology of the Web, plus search engines and other affordances based on Sir Tim Berners-Lee's innovation, the Strict Proletarian would find it difficult to fit the inhabitants of McAfee's inner, middle and outer rings into the same room, get them to participate in the same conference call, or exhibit their "emergent" behaviors using typewriters, copy machines, faxes and email. Speed, scale and connection patterns matter and the technology that spans these barriers is neither trivial nor insignificant to the phenomena Strict Proletarians value.

I believe that although both technology and broad bottom-up participation are necessary to achieve the Druckerian vision, neither element alone is sufficient to achieve the noble end of re-engineering how ordinary people work together to achieve the ends of enterprises they choose to affiliate with.

As Peter Drucker said: "The purpose of an organization is to enable ordinary human beings to do extraordinary things." Management: Tasks, Responsibilities, Practices Chapter 28, The Spirit of Performance, p. 361 (1974)

I nominate Peter Drucker and Douglas Engelbart as Patron Saints of Enterprise 2.0 (Strict Druckerian). If you don't know who either of these gentlemen are, I suggest you click their Wikipedia links for two pretty good short biographies.

Peter Drucker constantly advised businesses to give employees direct control over their own work and environment, with teams of "knowledge workers" responsible for work toward goals stated as broad business objectives rather than prescriptive plans. Drucker stated that management could only achieve sustainable profits by treating people as an enterprise's most valued resources, not as costs. In later years he described his role as "social ecologist" rather than management consultant.

"Marketing alone does not make a business enterprise. In a static economy there are no business enterprises. There are not even businesspeople. The middleman of a static society is a broker who receives his compensation in the form of a fee, or a speculator who creates no value.

A business enterprise can exist only in an expanding economy, or at least in one that considers change both natural and acceptable. And business is the specific organ of growth, expansion and change.

The second function of a business is, therefore, innovation - the provision of different economic satisfactions. It is not enough for the business to provide just any economic goods and services; it must provide better and more economic ones. It is not necessary for a business to grow bigger; but it is necessary that it constantly grow better...

Above all, innovation is not invention. It is a term of economics rather than technology. Non-technological innovations - social or economic innovations - are at least as important as technological ones.

In the organization of a business enterprise, innovation can no more be considered a separate function than marketing. It is not confined to engineering or research, but extends across all parts of the business, all functions, all activities." Peter Drucker, Management: Tasks, Responsibilities, Practices (1974)

At a 1934 Cambridge seminar by John Maynard Keynes, "I suddenly realized that Keynes and all the brilliant economic students in the room were interested in the behavior of commodities, while I was interested in the behavior of people." Peter Drucker, The Ecological Vision, p. 75-76, (1993)

"A manager's task is to make the strengths of people effective and their weakness irrelevant--and that applies fully as much to the manager's boss as it applies to the manager's subordinates." Peter Drucker, Managing for the Future: The 1990's and Beyond (1992)

In an equally distinguished career, Douglas Engelbart has been enormously influential in creating and inspiring the creation of technology we use today (far beyond his invention of the mouse), but Doug's goals have always been expressed in terms of improving the abilities of groups to address complex, difficult and important problems:

"By 'augmenting human intellect' we mean increasing the capability of a man to approach a complex problem situation, to gain comprehension to suit his particular needs, and to derive solutions to problems. Increased capability in this respect is taken to mean a mixture of the following: more-rapid comprehension, better comprehension, the possibility of gaining a useful degree of comprehension in a situation that previously was too complex, speedier solutions, better solutions, and the possibility of finding solutions to problems that before seemed insoluble. And by 'complex situations' we include the professional problems of diplomats, executives, social scientists, life scientists, physical scientists, attorneys, designers--whether the problem situation exists for twenty minutes or twenty years. We do not speak of isolated clever tricks that help in particular situations. We refer to a way of life in an integrated domain where hunches, cut-and-try, intangibles, and the human 'feel for a situation' usefully co-exist with powerful concepts, streamlined terminology and notation, sophisticated methods, and high-powered electronic aids." Douglas Engelbart Augmenting Human Intellect: A Conceptual Framework, Introduction, (1962)

On the term "social software", I believe it's fair to blame it on Clay Shirky - who had the misfortune to introduce a term that's perfectly respectable for a sociologist who studies how technology influences group behavior:

“It's software that supports group interaction. I also want to emphasize, although that's a fairly simple definition, how radical that pattern is. The Internet supports lots of communications patterns, principally point-to-point and two-way, one-to-many outbound, and many-to-many two-way.” − Clay Shirky, A Group Is Its Own Worst Enemy O’Reilly Conference (April 2003)

If the term "social" must be deprecated, I hope its banishment takes with it all Social X marketing buzzwords, job titles, twitter tags, and the well-earned disco ball reputations of the so-called Social Media gurus.

On "Return on investment" debates, I believe that Taylorist time-and-motion studies would show gains that exceed the modest costs of introducing and using Enterprise 2.0 software. However, for knowledge work where the potential business value is much greater than transactional value (e.g. reduced time to handle a purchase order), studies are difficult to design and far too easy to fudge. Long term experimental studies measuring business improvement are even more difficult:

"A very important surgeon delivered a talk on the large number of successful procedures for vascular reconstruction. At the end of the lecture, a young student at the back of the room timidly asked, 'Do you have any controls?' The great man hit the podium and said, 'Do you mean, "Did I not operate on half the patients?"' ... The hall grew very quiet and the voice at the back of the room very hesitantly replied, 'Yes, that's what I had in mind.' The surgeon's fist really came down as he thundered, 'Of course not, that would have doomed half of them to their death!'...The room was then quiet, and one could scarcely hear the small voice ask, 'Which half?'" - Dr. E. E. Peacock, Jr., University of Arizona College of Medicine; quoted in Medical World News, p. 45 (September 1, 1972) quoted by Edward Tufte in Beautiful Evidence (2006)

I believe the value of Enterprise 2.0 techniques comes from small to mid size groups within an organization who intentionally strive to improve their own ability to get work done, while opening the direct and indirect record of their work to others who then may become better aware of what their enterprise plans to do, is doing or has done - and who knows what.

Finally - having demonstrated the unerring truth of the Strict Druckerian position regarding the nature of Enterprise 2.0, I declare both the Strict Solutionist and Strict Proletarian interpretations to be false, heretical, and anathema. Living in our tolerant and civilized times, I found it difficult to imagine an appropriate way to separate those who obstinately cling to these heretical beliefs, until I ran across this nugget:

Nike does "email archeology" to decompose email thread to expose one part of a specific collaboration. :>) @lehawselive (4:20pm Nov 4, 2009)

So if you don't agree with me, I hope you spend the rest of your corporate life decomposing email threads from your corporate archive into Google Waves or Traction TeamPage comments where others can benefit from your labor if not from your ideas.

More

[ And so much more. It's the Web - you could look it up - or follow the fun on Twitter ]

Related

Afterword

This was far too much fun to write. I hope I haven't needlessly offended anyone, but I'm also happy to defend the essence of the Druckerian position in more serious terms; Enterprise 2.0 is a big tent and I hope it stays that way.

I also value the term Enterprise 2.0 for a reason over and above the Druckerian fantasy. Unlike terms invented to express a desire to sell software to managers (X Management - you do want to manage X, don't you?), Enterprise 2.0 expresses a simple, grounded wish:

"I wish the software I used every day at work allowed me to find what I want; discover what I need to know - along with surprises; and connect with people I don't even know to get my job done, learn more, and work in an enjoyable place." or much more narrowly: "Why can I find what I need with Google on the Web, but have to pull teeth to find anything useful when I go to work?"

This is a grounded wish since everyone in business has a direct basis for comparison - what they or their children see, use and enjoy on the public Web every day. This doesn't mean that expectations, behavior, and (uh sociology) of the public Web and the internal/external web of connections used in an enterprise are the same - but they are comparable with respect to desired experience.

To the extent that corporate barriers dash expectations, read Peter Drucker on how to get rid of those barriers or find a better employer.

To the extent that enterprise technology differs with respect to needs for privacy, finding information in a link-deprived environment and sharing access to confidential sources or legacy applications, Enterprise 2.0 offers the opportunity for vendors and community projects to create products that respond to that simple, grounded wish and measure the difference.

I'm not sure where Professor Andrew McAfee sees himself in this ecclesiastical model. I'd be happy to support his claim to any sub-numinous position.

Update Remembering Doug Engelbart, 30 January 1925 - 2 July 2013

Update 6 Jun 2013: In the original version of this post I used Strict Technarian to refer to those who believe there is a purely technical - specifically Web or Internet - solution to every problem. Since then the term solutionist has gained popularity, generally through the acerbic criticism of Evgeny Morozov. I switched the awkward Technarian to Solutionist.

Although I don't agree with all of what Morozov says - or the way he says it - I believe solutionist is a useful term. See James Temple's 3 April 2013 SFChronicle.com column Why Silicon Valley needs critics like Morozov

Update 21 Nov 2014: Enterprise 2.0 - Are we there yet? Is there a 'there' for Enterprise 2.0, or is it more like shaking a sleepy beehive?

Reinventing the Web

January 12, 2009 · · Posted by Greg Lloyd

John Markoff wrote a really good Jan 11 2009 New York Times profile, In Venting, a Computer Visionary Educates on Ted Nelson and his new book, Geeks Bearing Gifts: How the Computer World Got This Way (available on Lulu.com). Markoff notes that Tim Berners-Lee invented the World Wide Web, but: "Lost in the process was Mr. Nelson’s two-way link concept that simultaneously pointed to the content in any two connected documents, protecting, he has argued in vain, the original intellectual lineage of any object... His two-way links might have avoided the Web’s tornado-like destruction of the economic value of the printed word, he has contended, by incorporating a system of micropayments."

I was one of the skeptics who thought that the World Wide Web with its fragile one-way links would never take off as a global hypertext platform. Classic hypertext systems (from HES and Augment through Xanadu, Plato, Intermedia, Lotus Notes, and Dynatext) went to great lengths to preserve the integrity of links, relationships, and content.

The idea that any sensible person would rely on a global hypertext system - where links on one computer pointed at locations on another computer, and would break whenever the remote computer was unilaterally moved, renamed, taken offline, or abandoned - seemed absurd.

The idea that you would have no way to know what incoming links would break when editing or refactoring content seemed just as bad.
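To make that contrast concrete, classic hypertext systems maintained something like a backlink index alongside the forward links, so an editor could see incoming links before breaking them. A minimal sketch of the idea (names and structure are illustrative only, not any particular system's design):

```python
# Minimal sketch of a bidirectional link store, the kind of structure
# classic hypertext systems maintained internally. Names illustrative.
from collections import defaultdict

class LinkStore:
    def __init__(self):
        self.outgoing = defaultdict(set)  # doc -> docs it links to
        self.incoming = defaultdict(set)  # doc -> docs that link to it

    def add_link(self, src, dst):
        # Record both directions; the incoming half is what the Web's
        # one-way links discard.
        self.outgoing[src].add(dst)
        self.incoming[dst].add(src)

    def backlinks(self, doc):
        # Before moving, renaming, or deleting `doc`, a two-way system
        # can list exactly which incoming links would break.
        return sorted(self.incoming[doc])

store = LinkStore()
store.add_link("memex-notes", "as-we-may-think")
store.add_link("nls-demo", "as-we-may-think")
print(store.backlinks("as-we-may-think"))  # ['memex-notes', 'nls-demo']
```

On the open Web no such global index exists, which is why search engines later had to reconstruct the incoming-link graph by crawling.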

The World Wide Web protocols looked like they would work for relatively small cooperative groups like CERN, which could keep things from breaking by having shared goals, and using peer pressure plus out-of-band communication to keep distributed content alive.

Actually, that intuition was pretty good, because the World Wide Web took off in a direction based on other incentives compatible with those assumptions - and grew like crazy because, unlike the alternatives, it was simple, massively scalable, cheap, and eliminated the need for centralized control.

1) The Web became a distributed publishing medium, not the fabric for distributed editing and collaboration that Tim Berners-Lee and others envisioned. People and Web publishing engines like Amazon created content and kept it online while it had economic value, historical value (funded by organizations), or personal value. Content hosting became cheap enough for individuals or tiny groups. Advertising supported content became "free".

2) Search engines spanned the simple Web. Keeping content addressable now gained value since incoming links not only allowed people to bookmark and search engines to index what you had to publish (or sell), but the incoming links gained economic value through page rank. This provided even greater motivation to edit without breaking links, and to keep content online while it retained some economic, organizational or personal value.

3) People and organizations learned how to converse and collaborate over the Web by making it easy to create addressable content others could link to. The simple blog model lets people just add content and have it automatically organized by time. The Wiki model requires more thought and work to name, organize and garden content, but also creates stable, addressable islands of pages based on principles that reward cooperative behavior.

4) Search engines, syndication and notification engines built over the Web's simple, scalable protocols connected the Web in ways that I don't think anyone really anticipated - and work as independent and competing distributed systems, making rapid innovation possible.

Tim Berners-Lee made an inspired set of tradeoffs. Almost every concept of value on the Web: search engines, browsers, notification is built over his simple, open, highly scalable architecture.

I believe it's possible to provide what TBL calls "reasonable boundaries" for sharing sensitive personal or organizational data without breaking the basic W3C addressable content protocols that make linking and Web scale search valuable. That should be the goal for social and business software, not siloed gardens with Web-proof walls.

As TBL said in a Jan 2013 interview: “The web isn’t about just sharing everything, destroying privacy… [but] if I want to share something with you it shouldn’t be the technology that gets in the way.”

So when people ask what will deliver two-way links, fine grain comments and tagging, traceable transclusion and the promise of the Semantic Web, I suggest an approach that layers these hypertext capabilities over the basic Web in a way that exposes readable content which is absolutely compatible with the basic Web for all readers and existing engines.

Offer seamless collaborative editing, traceability, semantic search and other capabilities by extending the hypertext editing engines to support new layered protocols and transparently downsample richer models to deliver basic Web content to clients who use basic Web protocols. Offer extended formats and services to client or other servers with extended capabilities.
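One hedged sketch of the "transparently downsample" idea, using ordinary HTTP content negotiation: clients that understand a richer layered format ask for it in the Accept header, and everyone else gets plain HTML. The media type and function names here are invented for illustration, not part of any existing protocol:

```python
# Hypothetical sketch of layered-protocol content negotiation. The
# media type "application/x-layered-hypertext" is invented for
# illustration; the point is that the basic Web sees ordinary HTML.
RICH_TYPE = "application/x-layered-hypertext"

def render_rich(doc):
    # Extended representation: content plus the link metadata a layered
    # protocol could carry (two-way links, fine-grain addresses, ...).
    return {"body": doc["body"], "backlinks": doc["backlinks"]}

def render_basic(doc):
    # Downsampled representation: plain HTML any browser or search
    # engine can read, so the page stays a first-class Web citizen.
    return "<html><body>%s</body></html>" % doc["body"]

def respond(doc, accept_header):
    # Serve the richer form only to clients that ask for it.
    if RICH_TYPE in accept_header:
        return render_rich(doc)
    return render_basic(doc)

doc = {"body": "Hello, hypertext.", "backlinks": ["intro"]}
print(respond(doc, "text/html"))
print(respond(doc, RICH_TYPE))
```

The design choice this illustrates is that the extended capabilities are additive: a basic client never sees anything it can't parse, so links and search keep working across the whole Web.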

I'm sure that won't satisfy Ted, but short of a sea change in the basic structure of the Web - which is what Nelson's and others' global visions require - I believe you'll have to be satisfied with stable islands in the Web's storm-tossed sea and protocols that support robust connections among islands.

I believe it's even possible to implement Ted's micropayment transclusion model as a layered protocol. People's DRM aversion, rights contracting and enforcement seem to be bigger issues than the technical barriers.

I also believe that Enterprise 2.0 secure collaboration and social networking provide the motivation to make this reinvention of the Web a reality.

Traction TeamPage was designed from the start on these layered principles, working with and over the Web without sacrificing (internal) two-way links, paragraph-grain comments, tagging and relationships, content journaling, spaces with role-based borders, and other capabilities that match and better those of classic hypertext systems. Consider TeamPage a proof of concept.

I hope that the evolution of Enterprise 2.0 platforms leads to the definition of layered protocols that extend valuable hypertext capabilities across hypertext systems - Traction's and others' - extending the Web for everyone's use, while remembering the lessons of simplicity, scalability and innovation that the Web has taught us all.

For additional thoughts, see Peter O'Kelly's comments on Markoff's profile, and Peter's excellent follow-up notes on the Web and Hypertext.

For more on how to intertwingle sites and services over the Web see Intertwingled Work and Enterprise 2.0 - Letting hypertext out of its box.

Update 14 Jul 2014: See Reinventing the Web II for follow on discussion and analysis.

Related

I originally titled this post "Re: In Venting the Web" - but chickened out - grl

Enterprise 2.0 - Letting hypertext out of its box

April 24, 2007 · · Posted by Greg Lloyd

Image In his Mar 26, 2006 post, Putting Enterprise 2.0 in Perspective, Mike Gotta agrees with Tom Davenport and Andrew McAfee that a balanced discussion of E2.0 should include "... how well an enterprise addresses the complex organizational dynamics that often inhibit change," not just "irrational exuberance regarding the technology."

That said, Mike has a slight disagreement with Andrew McAfee on the evolutionary versus revolutionary nature of E2.0 technology. McAfee says:

"My optimism, and my interest in the component technologies of E2.0, comes not (solely) from my inherent geekiness, but from the fact that these technologies really are something new under the sun. They’re not extensions or enhancements to previous generations of corporate tools for collaboration and knowledge management; instead, they’re radical departures from them. Technology platforms that are initially freeform and eventually emergent, that require no nerd skills to use, and that contain the SLATES elements I proposed a while back were born on the Internet just a couple years ago, and are now starting to make their way behind the firewall." - Andrew McAfee, I STILL Agree with Tom, And yet …

Gotta replies: ".. Tools emerging under the category of social software are benefiting from common application, infrastructure and network services that were not mature in the eighties and nineties. ... It is true that originally Notes was a self-contained environment (some would call it monolithic). Notes came with its own infrastructure, complete with its own repository and even dial capabilities for mobile users. At the time, directory, storage and other infrastructure services were not readily accessible to applications in any consistent fashion. Today, we would not engineer a product in that manner but there was no other option back then."

Gotta concludes: "Today, we have a new set of design criteria that allows us to focus on the social aspects of how people work together, share information and communicate across groups and networks. That design criteria exploits a more mature collection of application, infrastructure and networking services. Much of E2.0 technology is evolutionary and in some ways, inevitable."

Characteristically, I agree with both of them. With Andrew, I believe there is a "radical departure" that distinguishes E2.0 technology from Lotus Notes, Groupwise, Intermedia, Hypercard, FRESS, Augment, HES and every other groupware and hypertext system. With Mike, I agree that: "As lower-level services become taken for granted, designers and developers are able to focus on software that exposes functionality that we now call 'Enterprise 2.0'."

I believe that the radical departure is the Web as the context of work: the universal medium, universal library, universal marketplace, and universal platform for personal as well as enterprise communication. After the rapid adoption of the read-mostly Web, we've seen the first use and rapid evolution of the Web as a platform for self and social expression.

Why not for work? I have nothing against new forms of self and social expression as emergent behavior in the workplace, but how about using Enterprise 2.0 technology for the everyday work required to design, build, sell and maintain a product or deliver a service?

I believe the primary barrier to Enterprise 2.0 adoption for an established business purpose is The 9X Email Problem rather than hierarchy and a command and control mindset. And I believe that the Web as the context for work is what surmounts the 9X problem by exposing almost all of the relevant working communication and context to search, links, authoring, tags, extensions, and signals (McAfee's SLATES, see his 2006 Enterprise 2.0 the Dawn of Emergent Collaboration).

In every previous generation hypertext system from HES through Lotus Notes, the ability to read, search, link and communicate came with a terrible price: it might work well, but only if you put everything you wanted to work with into some sealed box, and convinced everyone you wanted to work with to use the same box. From the earliest days of Vannevar Bush's Memex, the vision was universal, but the implementations were siloed. As Ted Nelson once said on the folly of using computers to simulate paper, Xerox PARC's paper simulation was followed by Apple's contribution:

"By tying little pictures of paper to files and the programs that created the files - Apple made things even worse. Now, instead of programs designed to work with just about any kind of file - mixing, matching and combining actions to do what people want - you have:

  • A program, and
  • A software company that owns the program
  • For every kind of file

Not just a simulation of paper, but multiple, incompatible simulations of paper!"

But the Web over the universal Internet turned the world-view of Lotus Notes (and the Sharepoint stack) inside out: no proprietary client, no proprietary representation, no requirement to work inside the proprietary box - and every motivation to make anything valuable you create or deliver compatible with the least common denominator representation outside the box: URL addressable HTML.

Core Web technology is not radical: http, HTML and the first generation of read / write web browsers and web servers could have been layered over the first-generation ARPANet in the 1970's. Berners-Lee's URL and HTML Web framework is simpler than the corporate point-to-point communication infrastructure that preceded it (PROFS anybody?), and much simpler than the hypertext systems of the 1980's and 90's.

Enterprise 2.0 tools work because they use the basic Web as a platform that does not limit discourse, and can make the content of even the most specialized line of business systems more valuable by linking to them in context. For example, market forces drive makers of ERP systems, CAD repositories and analytic systems to at least make their content viewable and linkable using the Web. That's all that's necessary to add a link from a blog or wiki to a contextually relevant object or report. Search, links, authoring, tags, extensions and signals provide a mechanism for "weak signal amplification" and discovery that works even at internet scale, and can work at the intranet scale as the enterprise becomes a link friendly environment.

With appropriate attention to consistent identity and permissioned access, the same principles open up working communication between the internal stakeholders of an enterprise and their external customers, suppliers, resellers, clients, sponsors and advisors - all for goal directed behavior that even the most hardheaded manager can understand as valid and a potential competitive advantage.

For thoughts on extending SLATES technologies with permissioned access to internal and external stakeholders, see Why Can't a Business Work More Like the Web? (.pdf), and Flip Test 1971 | Email versus Journal.

For more on Ted Nelson, see John Markoff's Jan 11, 2008 NY Times profile In Venting, a Computer Visionary Educates and Ted's own words in his newly published book Geeks Bearing Gifts: How the Computer World Got this Way.

Related

Traction Roots - Doug Engelbart

April 9, 2006 · · Posted by Greg Lloyd

Image
The source of the term Journal for the Traction TeamPage database is Douglas Engelbart's NLS system (later renamed Augment), which Doug developed in the 1960's as one of the first hypertext systems. Traction's time ordered database, entry + item ID addressing, and many Traction concepts were directly inspired by Doug's work. I'd also claim that Doug's Journal is the first blog - dating from 1969.

More importantly, Doug's aim has never been "content management" or some buzzword - it's been improving the performance of teams dealing with complex and challenging tasks - "raising their collective IQ". Augmenting human intelligence is a challenging and noble goal for social software.

In the late 1960's Doug created the Journal (along with the mouse, shared-screen interactive hypertext and video, dynamic outlining and many other inventions) to support the needs of high performance, problem solving teams.

Doug’s first hypertext Journaling systems were deployed as part of the original ARPANet Network Information Center (NIC), starting with ARPANet Node 3 at SRI - i.e. the third node on what we know as the Internet.

I’ve known and admired Doug’s work starting as an undergraduate Computer Science student using Andy van Dam and Ted Nelson's first hypertext system at Brown (1969). I had the privilege of meeting and working with Doug in the late 1980’s when he and Andy became members of Context Corporation's technical advisory board (Context was a commercial hypertext editing and publishing system with built in change tracking and early SGML support, used for aircraft maintenance manuals and similar applications).

Doug and Andy would visit Context in Portland OR every quarter for a three day meeting - on Context's plans, their advice, and their perspective on hypertext history and evolution. We also enjoyed meals and conversation. I'll always remember Doug's quiet and smiling manner as well as his incredible determination, deep understanding, moral commitment, and pioneering vision. He was and remains a hero to me.

My advice - if you want to invent the future of the web and social software, carefully read what Doug, Andy, Ted and Alan Kay have written. Their Wikipedia bios are a good starting point - I'll post a few favorite quotes here.

See the Doug Engelbart Foundation site (DougEngelbart.org) for Doug's current work, links to many of his papers, and his November 2000 National Medal of Technology Award citation. A few of my favorite quotes:

In 1975 Doug wrote:

Our Journal system was conceived by this author in about 1966. I wanted an underlying operational process, for use by individuals and groups, that would help bring order into the time stream of the Augmented Knowledge workers. The term "journal" emerged early in the conceptualization process for two reasons:

  1. I felt it important in many dynamic operations to keep a log (sometimes termed a "journal") that chronicles events by means of a series of unchangeable entries (for instance, to log significant events while evolving a Plan, shaping up a project, trouble-shooting a large operation, or monitoring on-going operations). These entries would be preserved in original form, serving as the grist for later integration into more organized treatments.
  2. I also wanted something that would serve essentially the same recorded-dialogue purpose as I perceived a professional journal (plus library) to do.

Compcon 75 Digest, Sep 1975 pp 173-178, Douglas C. Engelbart THE NLS JOURNAL SYSTEM see the full paper, courtesy of the Doug Engelbart Institute.
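Doug's Journal concept - a time-ordered log of unchangeable entries, each addressable by a stable ID - maps naturally onto a simple append-only data structure. The sketch below is purely illustrative (class and method names are my own, not Augment's or TeamPage's):

```python
# Sketch of the Journal idea: entries are appended in time order,
# never edited in place, and each gets a stable ID usable as a
# permanent citation handle (like Augment's journal item numbers).

class Journal:
    def __init__(self):
        self._entries = []  # time-ordered; immutable once recorded

    def record(self, author: str, text: str) -> str:
        """Append an entry and return its permanent ID."""
        entry_id = f"entry{len(self._entries) + 1}"
        self._entries.append({"id": entry_id, "author": author, "text": text})
        return entry_id

    def lookup(self, entry_id: str) -> dict:
        """Resolve an entry ID - the basis for stable linking."""
        return next(e for e in self._entries if e["id"] == entry_id)
```

Because entries are preserved in original form, any later document can cite an entry by ID and trust the reference will not drift - the property Doug describes as "grist for later integration into more organized treatments."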

In 1992 Doug wrote:

A result of this continuous knowledge process is a dynamically evolving knowledge base as shown in Figure-7 below, consisting of three primary knowledge domains: intelligence, dialog records, and knowledge products (in this example, the design and support documents for a complex product).

  • Intelligence Collection: An alert project group, whether classified as an A, B, or C Activity, always keeps a watchful eye on its external environment, actively surveying, ingesting, and interacting with it. The resulting intelligence is integrated with other project knowledge on an ongoing basis to identify problems, needs, and opportunities which might require attention or action.
  • Dialog Records: Responding effectively to needs and opportunities involves a high degree of coordination and dialog within and across project groups. This dialog, along with resulting decisions, is integrated with other project knowledge on a continuing basis.
  • Knowledge Product: The resulting plans provide a comprehensive picture of the project at hand, including proposals, specifications, descriptions, work breakdown structures, milestones, time lines, staffing, facility requirements, budgets, and so on. These documents, which are iteratively and collaboratively developed, represent the knowledge products of the project team, and constitute both the current project status and a roadmap for implementation and deployment. The CODIAK process is rarely a one-shot effort. Lessons learned, as well as intelligence and dialog, must be constantly analyzed, digested, and integrated into the knowledge products throughout the life cycle of the project.

Image Figure-7: The CODIAK process -- collaborative, dynamic, continuous.

Figure 7 itemizes the evolving knowledge base within three categories: (1) Dialog Records: memos, status reports, meeting minutes, decision trails, design rationale, change requests, commentary, lessons learned, (2) External Intelligence: articles, books, reports, papers, conference proceedings, brochures, market surveys, industry trends, competition, supplier information, customer information, emerging technologies, new techniques (3) Knowledge Products: proposals, plans, budgets, legal contracts, milestones, time lines, design specs, product descriptions, test plans and results, open issues.

from 'Toward High-Performance Organizations: A Strategic Role for Groupware' Douglas C. Engelbart, Bootstrap Institute, June 1992 (AUGMENT,132811) see the full paper, courtesy of the Doug Engelbart Foundation

[ quoted from grl1427, Greg Lloyd's private TSI blog post of August 2002 ]

Update Remembering Doug Engelbart, 30 January 1925 - 2 July 2013

Related

Tricycles vs. Training Wheels

February 2, 2006 · · Posted by Greg Lloyd

In Infoworld, Jon Udell writes When it comes to increasing human productivity, user interfaces aren't one size fits all and cites Doug Engelbart:

"Easy-to-use computer systems, as we conventionally understand them, are not what Engelbart had in mind. You might be surprised to learn that he regards today’s one-size-fits-all GUI as a tragic outcome. That paradigm, he said in a talk at Accelerating Change 2004, has crippled our effort to augment human capability.

High-performance tasks require high-performance user interfaces specially designed for those tasks. Instead of making every task lie on the Procrustean bed of the standard GUI, we should be inventing new, task-appropriate interfaces. No, they won’t work for everyone. Yes, they’ll require effort to learn. But in every domain there are some experts who will invest that effort in order to achieve greater mastery. We need to do more to empower those people. ..."

I agree, and second Jon's suggestion that we all should look carefully at Doug's goals and analysis. Doug's consistent position since the 1960's has been: valuable skills that make people productive have a learning curve, but may provide the only means to effectively augment one's capabilities. Doug often uses the analogy: Riding a bicycle - unlike a tricycle - is a skill that requires a modest degree of practice (and a few spills), but the rider of a bicycle quickly outpaces the rider of a tricycle.

Alan Kay (the godfather of Smalltalk and the PARC interface) picked up on this theme during a talk and panel discussion at the MIT Bush Symposium: 50 Years After As We May Think, quotes below (watch the video, it's great). Doug's hard-core position on this point has likely been one of the factors that limited acceptance and adoption of Augment/NLS [Doug's 1968 hypertext system for which he invented the mouse and chord keyboard].

From today's perspective, Augment/NLS suffered from a learning wall rather than a learning curve. Thirty years ago, Augment/NLS users had to understand, learn, practice, and use an entirely new and unfamiliar set of paradigms for communication (and typing - if you wanted to use Doug's chord set along with the mouse). The rewards were great, but the steepness of the path required heroic dedication.

The great challenge is finding an effective strategy to get people moving in a direction that delivers on the promise of Engelbart's Augment/NLS systems for highly skilled and dedicated teams - who reap the greatest benefit from a deep product - while making the entry point simple and clear to a naive or uninterested user.

Over thirty years later, we have the luxury of building on top of the experience and the social as well as technological infrastructure of the web. We need to make the entry barriers as invisible as we can, and make each step up the experience ramp deliver greater and greater returns.

This makes interface design much more challenging, but worth the effort.

In the same panel, Alan Kay said: "I characterize what we have today as a wonderful bike with training wheels on that nobody knows they are on so nobody is trying to take them off".

I think it's better to build software with training wheels that are easy to recognize and remove than to continue to build tricycles that no one can grow out of. As Alan says, it's a terrible mistake to assume that kids and grownups won't spend the time to acquire new skills; they will, so long as the payoff is great enough and mastery of the skill is itself a source of enjoyment. Mastery of Emacs can be just as enjoyable and rewarding as mastery of a video game.

Alan Kay: ... If you have ever seen anybody use NLS [Engelbart's 1968 hypertext system for which he invented the mouse and chord key set] it is really marvelous cause you're kindof flying along through the stuff several commands a second and there's a complete different sense of what it means to interact than you have today. I characterize what we have today as a wonderful bike with training wheels on that nobody knows they are on so nobody is trying to take them off. I just feel like we're way way behind where we could have been if it weren't for the way commercialization turned out.

Doug Engelbart: Well, strangely enough, I feel the same. It's part of the thing of the easy to learn and natural to use thing that became sortof a god to follow and the marketplace is driving it and its successful and you could market on that basis, but some of the diagrams pictures that I didn't quite get to the other day was how do you ever migrate from a tricycle to a bicycle because a bicycle is very unnatural and very hard to learn compared to a tricycle, and yet in society it has superseded all the tricycles for people over five years old. So the whole idea of high-performance knowledge work is yet to come up and be in the domain. Its still the orientation of automating what you used to do instead of moving to a whole new domain in which you are going to obviously going to learn quite a few new skills. And so you make analogies of suppose you wanted to move up to the ski slopes and be mobile on skis. Well, just visiting them for an afternoon is not going to do it. So, I'd love to have photographs of skateboards and skis and windsurfing and all of that to show you what people can really if they have a new way supplied by technology to be mobile in a different environment. None of that could be done if people insisted that it was an easy-to-learn thing. ...

Alan Kay: Looking back I think that one of the paradoxes is that we made a complete mistake when we were doing the interface at PARC because we assumed that the kids would need an easy interface because we were going to try and teach them to program and stuff like that, but in fact they are the ones who are willing to put hours into getting really expert at things - shooting baskets, learning to hit baseballs, learning to ride bikes, and now on video games. I have a four-year old nephew who is really incredible and he could use NLS [Engelbart's 1968 hypertext system] fantastically if it were available. He would be flying through that stuff because his whole thing is to become part of the system he's interacting with and so if I had had that perspective I would have designed a completely different interface for the kids, one in which how you became expert was much more apparent than what I did. So I'm sorry for what I did.

quotes from: Brown/MIT Vannevar Bush Symposium - 50th Anniversary of As We May Think, Notes from the Panels, see the video.

Update Remembering Doug Engelbart, 30 January 1925 - 2 July 2013

See also Doug Engelbart | 85th Birthday Jan 30, 2010
And here's what Enterprise 2.0 looked like in 1968 | Dealing lightning with both hands...