Blog posts

About Data Lifetime (2013)

Published on:

By Sylvain Lesage

Back in 2013, while I was working on the GeoBolivia team, we had the chance to meet in La Paz with Arnulf Bichler and Athina Trakas, who were working for OSGeo and OGC respectively at the time. We benefited a lot from their guidance on open data, geospatial standards and free software.

Around that time, Arnulf published a blog post that strongly influenced my thinking about data and software. Well, the idea was simple, but as my thinking is limited too… perfect match.

So, here is the blog post: “About Data Lifetime”.

Two quotes that resonate particularly in the AI era:

He wants a one-off development that will do just exactly what he needs for two or three years. Then he will throw it away and get something new. It is always a one-off, never more.

Your software will go away. Your data is going to stay.

I still think that software is basically trash, or at least disposable in the short or medium term, and that the value lies in the data and the metadata. Collecting the data, organizing it, transforming it into useful formats or shapes, making it available, archiving and preserving it, are tasks I find harder to automate than software development, and that generate more impact.

I love coding, and I don’t love filling in metadata (or telling people to do it). But as an independent consultant, in 2026, I think it makes more sense to focus on data than on software.

I recently asked for advice on LinkedIn, Reddit and Mastodon to understand why clients would find value in hiring an experienced developer like me in the AI era. The answers are less depressing than I expected, but they also comfort me in thinking I have to adapt my offering. I’d better focus on 1. shaping complex software projects, 2. ensuring quality (prototype -> production-ready), 3. helping clients understand, organize and get value from their data.