
Archive for the ‘Robots’ Category

The “collapse of time” was an important meme in the Techonomy 2019 session on Super-Evolution: the idea that startups can now harness rapid prototyping and vast pools of data to develop radically new business models quickly and at scale (video here).


Super-Evolution is about creating more options, dramatically more, invented by AI, that is, by non-human logic. (See also Haydn Shaughnessy on the importance of maximizing options and radical adjacencies versus core competency in innovation.)

“Leave behind the myth of the grand plan and create the conditions for optionality and just-in-time strategy.” (Haydn Shaughnessy)

The first time I felt that sensation of collapsing time was while watching Elon Musk’s Tesla 2019 update. I felt beaten by algorithms. The Tesla is learning from data on human behavior and driving like a human, but ultimately it will “EXCEED” that behavior (at 01:48:15).

There you have it: gradually, but suddenly, we have a singularity. Gradually, but suddenly, all jobs are doomed. We are not going to stop this with an ethics council or with regulation. The train has left the station; the genie is out of the bottle.

“The fleet wakes up with an over the air update”

PR or product? The same question was asked some months later by Jean-Louis Gassée regarding the Cybertruck launch:

“Elon Musk forces us to be of two minds. On one side, we have Musk the Mountebank; on the other, a Captain of Industry.”

I had the same feeling of time-space collapse and irrelevance when watching this awesome interview with Rahul Sonnad, CEO and Co-Founder of Tesloop, explaining how “Robo-Mobility is a hospitality service” once “cars are appliances”.

Are we toast? And do we need to reboot, reskill, etc., if we don’t want to become irrelevant? Venkatesh Rao gives his perspective when reflecting on Inventing Time, playing on Alan Kay’s “It is easier to invent the future than to predict it” and William Gibson’s “The future is already here, it is just unevenly distributed.”

“Riding in a Tesla made the electric vehicle future seem utterly inevitable in a way that kinda killed the present for me. Suddenly I could no longer look at gasoline cars the same way. Driving in my own car felt different, like I was stuck in the past, waiting for the price of the future to come down to the point where I could afford to live in it. So a Tesla creates the future in the sense of both the Alan Kay and William Gibson quotes. It makes the future real in a deep way that is like making time itself real. And you know this because the feel of the present feels different, like you’re heading down a dead-end, a lame-duck future. You’ll have to either abandon it as soon as you can or end up dying with it.”


Around the same time, I was lurking in Simon Ferdinand’s Mapping Beyond Measure: Art, Cartography, and the Space of Global Modernity. He could have added the Time of Global Modernity, as he writes about spatial (spheres) and temporal (time collapse) ruptures.

“Often map artworks recapitulate the narratives of rupture (spatial as well as temporal) through which global modernity differentiates itself from inherited pasts and surroundings.”

And:

“Maps have proven integral… to the experience of ‘time-space compression’.”

Greenaway

It made me think of Peter Greenaway’s film “A Walk Through H: The Reincarnation of an Ornithologist” (1978) and “A Walk through a Thousand Plateaus”, an homage to that film.

It is probably a sign of the times that, while preparing his new book “Agency”, even the great William Gibson lost his sense of how weird the world has become, to the point where the present bypassed his sci-fi scripts: “His future had to catch up with the present.” Hence his “stubs”: alternative timelines in which technologists (and, more tellingly, hobbyists) of the future are able to meddle.


Hobbyists and meddling: probably the right words for not becoming alienated. I would call it “tinkering”: maximizing options that human logic cannot necessarily spot or generate in time.


Read Full Post »

The theme for Techonomy 2019 in Half Moon Bay, California was “Reset and Restore: Governing Tech, Retrieving Ethics, and Acting on Climate.”

Keen and David

In the opening session, Founder and Host David Kirkpatrick set the tone: “These are serious times.” The interview of David by Andrew Keen that followed was really interesting. Keen rightfully asked what needs to be reset, and, if we have to restore something, whether that means a nostalgic return to the good old times or something else.

To make a long story short, it seemed the answer could be distilled to resetting and restoring toward more humanity.

Konstantinos Karachalios, Managing Director of IEEE’s Digital Ethics department, referred to the German-Jewish Viennese philosopher Günther Anders, who in 1956 wrote “The Outdatedness of the Human Species”.

Konstantinos also shared some strong opinions about the power (in)equation, the asymmetry in power between big tech and us, and summarized his thinking as “The Time of (Engineering) Innocence is Over”.

Colin Parris @colin_j_paris gave a session titled “Why AI has to be humble”, about GE’s use of self-learning AI in the building of GE jet engines. It was a super-slick, professional presentation, almost too clinical. The last slide was about “Intimidation by Immortal Machines”.


My head was spinning, and it got me thinking of John Markoff’s 2015 book “Machines of Loving Grace: The Quest for Common Ground Between Humans and Robots”.


In itself, the book’s title is a spin on Richard Brautigan’s “All Watched Over by Machines of Loving Grace” from 1967 and, of course, Adam Curtis’s fantastic 2011 documentary “All Watched Over by Machines of Loving Grace”.

 

I like to think (it has to be!) of a cybernetic ecology

where we are free of our labors

and joined back to nature,

returned to our mammal brothers and sisters,

and all watched over by machines of loving grace. 

Richard Brautigan, “All Watched Over by Machines of Loving Grace” © 1967

Let me put all this against the backdrop of what I saw and experienced a couple of days earlier at the San Francisco Museum of Modern Art (SFMOMA).


Richard Mosse, "INCOMING" - Picture by Petervan

On the 7th floor, there is an amazing video installation by Richard Mosse called “INCOMING”. It is about the horrible conditions in another Western export product, refugee camps, and about related issues of sovereignty, warfare, and surveillance. The installation forces us to confront our own complicity. Strongly recommended; it runs at SFMOMA until 17 Feb 2020. Warning: you won’t come out of it smiling!

See also interview with the artist in Forensic Architecture

The entrance of the installation also includes a picture of Berlin’s Tempelhof, a symbolically loaded site to house asylum seekers.


Tempelhof context

“…, and the airfield has been transformed into a popular public park. Some of its adjacent buildings and territory were designated as an emergency refugee shelter in 2015.”

What misery! What a shame for a “modern” society! This installation made me rethink my opinion about refugees. For me, it undercuts the whole semantic discussion of “asylum seekers” versus “economic” refugees. There is no difference. When people become so desperate that they flee their homes, take these incredible risks, and withstand these inhumane circumstances, those semantics become irrelevant.

This injustice is going to explode in our faces sooner or later: a toxic mix of climate change, inequality, and the 1% owning 99% of the wealth. I can only hope that I will not be treated this way when I or my children have to seek refuge from climate change or other disasters in the future.

All the big problems of today are crying out for more compassion, more morality, and less greed. The root cause is a lack of morals combined with an abundance of greed.

Putting it all together, “Immortal Machines of Loving Grace” may be better rendered as “Immoral Machines of Loving Greed”. Replacing just two words probably describes our Zeitgeist more adequately.

In that sense, some of the discussions at Techonomy 2019 should have addressed the refugee crisis, rather than holding safe conversations about the attention economy, tech supremacy, or immortal machines of loving grace in a five-star luxury hotel.

See also my separate post on the key memes of Techonomy 2019.


Read Full Post »

This is a short (and a bit weird) morsel on no longer understanding a clue, on the feeling of encountering a completely foreign world.

It happened to me several times in recent months that I read or met something or somebody and really had no clue what they were talking about:

  • A friend shares with me her business plan for a new app, and I have no clue what it is about, not even after having (tried to) read the associated white paper
  • The book “What Algorithms Want” by Ed Finn
  • The “God is in the Machine” post by Carl Miller
  • The 1000 dimensions of algorithms in James Bridle’s “New Dark Age”
  • Eddie Harran’s (aka Dr. Time) Temporal Labs, a research lab investigating time’s impact on humanity

From the “God is in the Machine” post:

We sat there, looking at the computer, his creation laid out in multi-coloured type. “This is all to do with complexity,” he said contemplatively. “Complexity of input. Complexity of analysis. Complexity of how outputs are combined, structured and used.” 

 “Truth is dead,” he sighed. “There is only output.”

After some one-on-one conversations with some of these authors, it looks like I have missed a whole generation of aesthetic language that is found only in apps, games, and Netflix-ish series like Black Mirror, Mr. Robot, Tangerine, Ratter, and Skam.

black mirror

Black Mirror – Season 4 – 2017

It feels like digital incest. Trying to hide from your virtual self. A virtual loop of digital identities and personalities. Not knowing what is real and what is fake or sliced/looped faith.

It also makes me think of this extract from Bill Gates’ review of Capitalism without Capital:

It took time for the investment world to embrace companies built on intangible assets. When we were preparing to take Microsoft public in 1986, I felt like I was explaining something completely foreign to people. Our pitch involved a different way of looking at assets than our option holders were used to. They couldn’t imagine what returns we would generate over the long term.

It feels like I cannot imagine what these new aesthetics will mean in the long term, and how they are already influencing Generations X, Y, and Z.

I am missing @swardley’s situational-awareness map of movement and position. Where is the anchor? What is edge and what is commodity?

Visit to the Roger Raveel Museum, 28 Sep 2018

If you are still up for it, here are two soundscapes from my visit to the Roger Raveel Museum:

Still with me? Where am I? What’s next? Where is this going? How fast? How? When? With whom? Who is cheating? Who’s not?

Are we entering a digital matrix, where real and surreal blur into a new perception?

Tell me if you understand.

Are we all lost?


Read Full Post »

We use models and metaphors to make sense of our organisational structures: to understand them, make predictions, and apply change.


Bee hive - via Bridging the Gap

Some well-known models are:

  • Ants in colonies
  • Bees in hives
  • Apes in jungles
  • Humans in neural networks
  • Organisations as machines
  • Hierarchies, wirearchies, holacracies

Models are not reality. Models are an abstraction of reality. Same for metaphors. They help us tell and understand a narrative.

We are not apes, ants, or bees. We are humans. As Jonathan Haidt explains at length in his book “The Happiness Hypothesis: Finding Modern Truth in Ancient Wisdom”, I am struck by all the noise humans put on the system: “We are all hypocrites”, and we are both the rider (the conscious, the rational) and the elephant (the unconscious: feelings, instincts, genes). Most models assume the rider is in charge. The rider is not in charge.

Structural change leads to structural behaviour change. Structural change needs high quality connections and flows.

“A high quality connection is one where information transfer is rapid, reliable, and noise free,” says Tom LaForge.

But in real life, this information transfer is NOT noise free. Maybe in some nirvana love relationship, but usually not at work.

Noise comes from the motivations of the elephant (the unconscious). Some examples:

  • Reciprocity
  • Prestige
  • Self serving biases
  • Power
  • Hypocrisy
  • Arrogance and entitlement

In most re-orgs, people look at the motivations and incentives of the rational rider. They ignore the elephant. They forget the rider is not in charge.

High quality connections need something more than speed, reliability, and noise-freedom.

There should be some dimension, ambition, and alignment of “spiritual, moral and aesthetic advancement”.

In this category, we find standards and appreciation for:

  • Care
  • Tradition
  • Craftsmanship
  • Beauty
  • Proportion
  • Sacredness
  • Infinite games

See also my own post about Kevin Kelly’s qualities created at the transaction, which is more about qualities of resulting products and services than qualities of structure: https://petervan.wordpress.com/2017/04/19/sine-parole-19-apr-2017/

And then there is governance.


Simple Google search on organisational hierarchy

The simplicity of the hierarchy works well on a slide or a hand-out. You can document it in a spreadsheet, a box diagram, and so on. But all these representations do is frame the conversation in an illusion of simplistic two-dimensional structures. It is the specialty of management consultants to think and present in two dimensions; it makes things easy for executives to understand.

But if you are used to a three-dimensional view of reality, you can’t understand why the flatlanders don’t see what you see. As long as you are primed in 2D, you won’t see what the third dimension shows.

A better picture or metaphor for an organisational structure would be something like this:


Relativity – 1953 Lithograph by M.C. Escher – 294mm x 282mm


Ricardo Bofill – La Fabrica – Old cement factory – Barcelona, Spain

It’s messy. At many moments you don’t know anymore where you stand. The perspective changes all the time. You get disoriented.

There is a general definition of a robot floating around:

Robot = sensors + mind/computer/algorithm + body (hardware).

But humans are not just: senses + brains + body.

Computers are not like brains. Brains are not like computers. Our human models are different from machine models. Machine understanding is different from human understanding.

Humans are not just nodes on a network/grid that can be governed by coded social contracts, blockchains and AI. If you do that, humans are just cogs in another machine. Humans become cogs in a network.

The obvious case is of course Uber, an economy of value extraction rather than the so-called sharing economy. For Uber, all the drivers are already cogs in a network, for the sole benefit of the monopoly.

Being cogs in networks is an insult to humans. But we are just getting started.

Does it still matter at all these days? We are already in a new world of “alien knowledge”, where machines justify knowledge. Check out this fantastic long read by David Weinberger.


Via David Weinberger - Illustrations by Todd Proctor / YouWorkForThem

“The paradigmatic failures seem to be ones in which the machine justification has not escaped its human origins enough.”

Organisations are not models, buildings, or boxes. They are like rivers with information flows, or building skeletons, where the structure of the building guides traffic and connections.

David Weinberger talks about models created by machines. Models that machines can understand and we don’t. It is very much as he concludes:

“It has taken a network of machines that we ourselves created to let us see that we are the aliens.”

If we don’t want to end up as cogs in networks, we need to aim for structural advancement along a spiritual, moral, and aesthetic dimension.


I am in the business of cultivating high quality connections and flows to create immersive learning experiences and structural change. Check out: https://petervanproductions.com/

 

Read Full Post »


The Project X building at 33 Thomas Street, lower Manhattan, NYC

This week, The Intercept ran a fascinating (well, quite disturbing, actually) article describing the possible surveillance role of Project X, an AT&T-owned building in the middle of Manhattan sitting on top of some major telephone and communication switches (there are apparently many other buildings like it in the USA, and most probably elsewhere).

It becomes even scarier if you read the article against the backdrop of the names and backgrounds of some of the people appointed to the Trump administration in the last couple of days.

The building was designed by the architectural firm John Carl Warnecke & Associates, whose grand vision was to create a communication nerve center like a “20th century fortress, with spears and arrows replaced by protons and neutrons laying quiet siege to an army of machines within.”

Some of Warnecke’s original architectural drawings for 33 Thomas Street are labeled “Project X.” It was alternatively referred to as the Broadway Building. His plans describe the structure as “a skyscraper to be inhabited by machines” and say that it was “designed to house long lines telephone equipment and to protect it and its operating personnel in the event of atomic attack.”

I spotted the article just two days after seeing a short seven-minute documentary (hence “Doc7”) on Belgian television about the artist Renato Nicolodi.


Renato Nicolodi, a young artist from Flanders, makes architectural models of buildings that are not intended to be built.


Pulpitum II by Renato Nicolodi, 2012

Long Lines Building, NYC, by John Carl Warnecke & Associates

That made me think about my time as an architecture student in Ghent, where we were allowed (or should I say incentivised?) to design buildings that never had to be built, at least in the first two years of the programme. Full creativity nirvana, quoi.

The work of Nicolodi resonated with me for another reason. His models are actually mausoleums rooted in the memories of his grandfather, who spent the Second World War in various prisoner-of-war camps and described them meticulously in the conversations Renato held with him. The recordings of those conversations are still a daily source of inspiration for Renato.

It woke old memories from my youth when, at the age of 6 or 10, I visited my grandmother, who lived in Ledegem, a little village 17 km east of Ieper, a town that will be remembered forever for the first use of poison gas in World War One.

It makes me wonder about the workings and selectiveness of my memory. Since I started my sabbatical at the beginning of November 2016, I have felt restless.

Being disconnected from work, “the job”, gives me plenty of space for reflection, experimentation, silence, and being alone. I love the silence of the morning house before the rest of the family has woken up.

But this stillness also seems to bring back many old memories, going way back to my childhood, things I have not thought about in the last 50 years. On the other hand, my short-term memory seems to be getting very selective, almost in ignoring mode, to the point that my lovely wife sometimes wonders whether I should go and see a doctor. I think I am doing fine.


Ledegem WWI cemetery, 2016

At the end of my grandmother’s garden was a cemetery holding 85 Commonwealth burials and commemorations of the First World War. I remember playing on the walls and the crucifix of the cemetery. In my memory, the place was much bigger than in this recent picture. I also remember some of the bunkers that you still find scattered here and there throughout the landscape in this region. I remember playing in one at the seaside before they were closed off to the general public. I remember the smell of wet sand.


German WW1 Command Bunker, Ypres Salient

The memory also put me in contact with another aspect of my onlyness (I am currently reading the draft manuscript of Nilofer Merchant’s next book): where I come from. My father is from a family of 7 kids, all of whom had to be taken care of by my grandmother alone, as her husband died in a tragic car accident (he was on a bike) just before the start of the Second World War. So it was surviving on a shoestring.

Deep in my (un)consciousness, there is the fear of this shoestring poverty: that we’ll have to hide again in the coldness and humidity of bunkers in the polders. A dystopian threat of dark secrecy, manipulation, corruption, and a fundamental loss of trust.

That is what bunkers and secret buildings do to me. Even if they are just architectural models that are not intended to be built.

The new models don’t seem to be intended for humans; they are intended to host machines. How can we reclaim our humanity?

Read Full Post »

At this year’s Innotribe Sibos, we have a session about digital ethics, part of a full day on man-machine convergence.

Some of that conversation will be about the use and control of data. With this post, I would like to add my perspective to that conversation, based on some recent thinking on human agency.

At the recent MyData2016 event in Helsinki, I was surprised how little the thinking about personal data stores has evolved since 2012, when I was myself deep in the trenches of distributed data sharing.

It was a really great conference, well organized, with a cool audience, but like many conferences it was the tribe talking to the tribe, believers talking to believers, all thinking that their lens was the right one, with little or no contrarian view.

I wanted to be that contrarian, and challenge a bit the assumptions.

At the event there was a lot of talk about “PIMS”: Personal Information Management Systems, also called personal data stores or personal data “clouds”. I don’t want to get into the subtle semantics here.

At one moment, Jamie Smith from Ctrl-Shift, whom I respect a lot, said something along the lines of “PIMS are all about giving people agency”.

I think that is a big illusion, and that is what my talk was about: the illusion that the problem is about taking back ownership and control of your data, and that a PIMS is the solution. I believe we are discussing the wrong problem and the wrong solution when we talk about managing our own personal data on our own terms and conditions.

Owning your own agency is more important than owning your data. That in essence is what my talk was about.

My presentation at #MyData2016 conference

UPDATE: here is the link to the Prezi of this presentation. Because there is so much video in this Prezi, it takes 2-3 minutes to load. Be patient 😉

The talk is part of a longer story of more than one hour, wandering through a whole bunch of philosophical, ethical, and artistic considerations. At this event, I got only 20 minutes, and I told the moderator he could cut me off, which he did most elegantly (no pun intended) at the end of my presentation.

My agency vs. my data is a pretty big deal.

  • It is not about buying but creating
  • It is not about my data but my agency
  • It is not about privacy but about shelter
  • It is not about power asymmetries but relationship symmetries
  • It is not about MyData but about OurData

In that sense, the GDPR (General Data Protection Regulation) is shooting at the wrong problem. In that sense, our politicians and leaders in general are once again excelling at solving the problems of the past.

I got some good reactions after this talk, from Doc Searls saying “you gave the talk that I always wanted to give”, to somebody else sending me a tweet and an email saying “your presentation has changed my life; I decided to leave Facebook after more than 10 years”.

There is such a strong tension between our actual reality and the desired reality that we are currently moving in some form of virtual or surreality. But as Magritte said:

“Surrealism is the immediate knowledge of reality”

And we feel lost. We escape and try to reconnect nostalgically to what was, and we are afraid of what is going to be. People focus on the surreality of their phones instead of on real life.

People believe that what is on their phones and PIMS is reality, and that it can represent us as human beings. But as Markus Sabadello said at this event: “Technology will not be able to represent the full complexity of human beings”.

Our devices and apps make us believe we are in control, because we now can “manage” our data and lives. But we are focused on managing life, rather than living it. That is our big illusion.

To summarise, I believe our plan and ambition towards our desired reality must at least have following components:

  • This space needs to be regulated; regulation means setting norms AND policing them
  • These norms must be ethical and moral
  • We must decide who sets these norms, who polices them, and who penalises/rewards good behaviour.

For that, we must bring in “society-in-the-loop”, and not let this be decided by governments, corporations, or, god forbid, algorithms.

 


Society-in-the-loop by Iyad Rahwan

We must expand ourselves from a problem (efficiency) orientation to a creative (value-creating) orientation, because the future is not about solving the past but about knowing what you want and using mastery to make it happen.

Last but not least, we must be very aware of the shallowness of the actual reality, and strive for high quality work with high quality attention, presence, and meaning, also called “Deep Work”.

Maybe next year, they should call the conference #MyAgency2016.

Read Full Post »


Artificial intelligence. Cognitive computing. The Singularity. Digital obesity. Printed food. The Internet of Things. The death of privacy. The end of work-as-we-know-it, and radical longevity: The imminent clash between technology and humanity is already rushing towards us. What moral values are you prepared to stand up for—before being human alters its meaning forever?

This is not me saying this. This is Gerd Leonhard, a new kind of futurist, schooled in the humanities as much as in technology. A musician by origin, Gerd connects left and right brain for 360-degree coverage of the multiple futures that present themselves at any one time. In 2015, Wired Magazine listed Gerd as one of the top 100 most influential people in Europe.

In his most provocative book to date “Technology vs. Humanity: The coming clash between man and machine” (Amazon Affiliated link), he explores the exponential changes swamping our societies, providing rich insights and deep wisdom for business leaders, professionals and anyone with decisions to make in this new era.

If you take being human for granted, check out this trailer for a movie he made with Jean-François Cardella, his film producer.

 

 

Gerd has a new book out, and I recommend it strongly. I am not alone.

 

“Gerd Leonhard is most definitely a member of Team Human. Here’s his convincing and heartfelt call for the reinstatement of people and purpose into the technology program.” – Douglas Rushkoff, Author of ‘Throwing Rocks at the Google Bus’, host of the ‘TeamHuman’ podcast

“Gerd Leonhard provides a fascinating look at the impact of exponential technologies and the dilemmas we will face in adapting to—or being adapted by—these. His book really makes you worry—and think.” – Vivek Wadhwa, Academic, Researcher, Writer, and Entrepreneur.

 

A good overview of the book can be found in Forbes’ recent interview with Gerd Leonhard and his reflections on digital ethics:

“Like sustainability, ethics is often thought of as a nice to have, a thing to consider when you have time, a luxury, non-monetizable. But now it is becoming clear that those distinctly human things that are not measurable (I call them the “androrithms” – as opposed to algorithms) such as emotions, intuition, beliefs and ethics are what sets us apart from machines.”

Gerd’s thinking is of great relevance to financial services. Because the whole value proposition of the financial services industry is about to change, it needs to reinvent itself in order to discover and grow new values and revenue streams.

 


 

“In general you can say the financial industry has been asleep at the wheel for the past ten years, but it has woken up with a start,” says Leonhard, and

“The Darwinian megashifts of exponential technologies eventually challenge most of our assumptions, meaning somebody is going to reinvent the way we think about stock markets and what a stock-market actually is. After we get the blockchain and a global digital currency, the next step is to revamp the entire logic of the stock market. And that is imminent.”

In addition to the book and the film, Gerd has created a unique experience called The Future Show Live. The Future Show Live demonstrates what exponential technologies are doing to our world of business and society, and creates a context around financial services, pointing people toward how they can innovate from inside an organisation rather than rest on outmoded systems.

We will need to embrace technology, but not become it. We will need to find ways for technology to actually serve humanity (i.e., to support human flourishing and contentment), not vice versa.

Gerd Leonhard will be hosting The Future Show Live at Sibos at the Innotribe stand next to the main Sibos stand on Wednesday, 28th September from 9:30-10:15am.

All illustrations are by Gerd Leonhard and are licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

 

Read Full Post »
