Interview with Tabea Lurk and Jürgen Enge

Brussels, April 5, 2011

 

Tabea Lurk and Jürgen Enge both teach at the Hochschule der Künste Bern in Switzerland and at the Staatliche Hochschule für Gestaltung Karlsruhe in Germany. They have worked together on the preservation of digital art since 2006 in the framework of projects such as AktiveArchive. Together, they have developed strategies and concepts to document and preserve computer-based artworks facing the obsolescence of their hardware and software components, through, amongst other tools, their Netart Router and the use of virtual machines. Emanuel Lorrain (PACKED vzw) met them to talk about the routines they use and how they envision the future of computer-based art preservation.

 

PACKED: Could you tell us how you both started working on computer-based art preservation?

Tabea Lurk: I started in 2006 with the AktiveArchive project1, when I was hired to develop strategies for the preservation of computer-based art and especially internet-based art. That was after I finished my traineeship (Volontariat) at the ZKM Karlsruhe2, where I had had several student jobs since 1999. Even though I wasn't working on preserving the artworks, I was in contact with computer-based art and interactive installations.

As an art historian, you can get close to the technical questions, but there comes a point when you don't have enough technological knowledge for the deep understanding necessary to preserve such artworks and their technological background. That was the point when Jürgen joined me, because he is a computer scientist.

Jürgen Enge: I studied computer science at the University of Karlsruhe, and the first time I came across computer-based art was in 1997, when the ZKM-Centre opened. At that time, I worked at the ZKM Media Lab and was in contact with several artists, helping them build their artworks. In the following years, I worked at the Medienmuseum3 and “repaired” artworks. After the ZKM, I went to the Zurich University of the Arts, where I gave a full course on Mobile Applications Design. Then I went back to the ZKM, where for two years I headed the Institut für Netzentwicklung4, the part of the ZKM that dealt specifically with internet-related questions, websites, etc.

After those two years, I came to the Karlsruhe University of Art and Design (HfG Karlsruhe), where I worked on projects that tried to bring together European media art institutions and their collections, especially video art. The first project, for which I applied together with Woody Vasulka, was OASIS – Open Archiving System with Internet Sharing5 (headed by HfG Karlsruhe), followed by GAMA – Gateway to Archives of Media Art.6 That was when I started to work with Tabea on the development of routines for conservation, documentation structures, etc. – at that time she was still attached to AktiveArchive.

 

PACKED: So you thought it was important to bring your respective skills together?

Tabea Lurk: Yes, because even though as an art historian I always try to analyse the work, the lines of code and the programming, it feels to me that I stay at the surface level. It is much better to collaborate with a computer scientist such as Jürgen, who has experience with media art and can define what components the technical core of the artwork is made of.

Jürgen Enge: I am much more on the technical side of component identification, while Tabea deals with the semantic part.

 

PACKED: How do you approach an artwork at first?

Tabea Lurk: Normally, we make a security copy first of all. Being able to make as many copies as you want is one of the huge advantages of working with digital material. You can really test several aspects without risking damage to the artwork. When I go through the artwork, I try to document the code and get a feeling for what the core of the artwork is based on. I ask myself how it functions on a profound level, and then what is required to preserve this core. So I first analyse the work, and then Jürgen is there to explain the technological aspects of it.
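
As a minimal sketch of such a first step (the file names are hypothetical), a security copy is only trustworthy once it has been verified bit for bit, for instance with a checksum:

```python
import hashlib
import shutil

def sha256(path, chunk_size=1 << 20):
    """Compute a SHA-256 checksum in chunks, so large disk images fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical file names: a disk image acquired from the artist's machine.
original = "artwork_disk.img"
copy = "artwork_disk_security_copy.img"

shutil.copyfile(original, copy)

# The copy is only usable for testing if it is bit-identical to the original.
assert sha256(original) == sha256(copy), "security copy does not match original"
print("verified:", sha256(copy))
```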

Sometimes a work can be preserved in various ways, but what we are interested in is finding out how to keep the authenticity of the work. What is the artwork’s meaning, what is “the original”, etc.? A technological understanding is very important for such a process. It makes the preservation work easier and better, because once the technological background is clear, the discussion can start. How can the work be documented? What strategies should be employed? Our two complementary views of the artwork are required, because when you work by yourself, there is no possibility of correction. This dialogue is important just to check that you are staying on the right path.

 

PACKED: In your writings you distinguish two main areas of the artwork: the “work relevant components” and the “environmental elements”. In order to determine these, you have developed a concept that you call the “work logic”. Could you explain these distinct parts and terms a little more?

Tabea Lurk: What we try to do is determine what the 'core' of the artwork is and then, as with an onion, move progressively through the outer 'skins' to find out which parts are relevant for making the work run, which software libraries are relevant, etc. If you then go to another skin of the onion, you reach the operating system, which is often less relevant. These differentiations are important because they allow you to change or modify things in order to sustain the work in the long term.

Jürgen Enge: For us, there are two possible points of view from which you can look at the 'work logic' of an artwork. They constitute two semantic levels: one is the technological part and the other is related to the historical and conceptual part, which we could also call the “level of meaning”. On the technological level you define, for instance, the pieces of software that don't belong to the operating system and are part of the artwork. Then you can start to analyse how these specific pieces of software (of the artwork) interact with the rest. On the semantic level you qualify the object-based parts of the artwork: you look at the computer as an object of hardware and software, etc., and you ask which parts have been touched by the artist. This represents more the art-historical point of view.
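
Recorded concretely, such a layered description might look like the following minimal sketch (all names and components are illustrative, not taken from a real case file): each 'skin' lists its components and whether they may be replaced.

```python
# A hypothetical inventory following the 'onion' model of the work logic.
work_logic = {
    "core": {  # innermost skin: must remain untouched
        "components": ["artist-written simulation code"],
        "replaceable": False,
    },
    "related_software": {  # libraries the core calls, but did not author
        "components": ["video capture library", "image library"],
        "replaceable": True,  # as long as the interface to the core is kept
    },
    "environment": {  # outermost skins: OS and hardware
        "components": ["Linux distribution", "PC hardware"],
        "replaceable": True,
    },
}

for layer, info in work_logic.items():
    print(layer, "->", info["components"], "| replaceable:", info["replaceable"])
```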

 

PACKED: Could you give us an example of what you consider a “core”?

Tabea Lurk: Two case studies we've been working on recently can illustrate these two levels or approaches of the “work logic”: Liquid Perceptron (2000) and Schnur (2004/2010). Liquid Perceptron is an interactive installation by Hans H. Diebner7, where the visitor enters a room containing a kind of closed-circuit, full-size video projection. The visitors observe themselves in the video, not as a clear representation, but as movements that are translated into waves. The filmed space is reproduced on the screen based on the theory and algorithms of neural networks, and as a whole the installation represents a kind of interference pattern of a neural network.

 

Installation view of Liquid Perceptron during the <i>Einstein on the Beach</i> event in Berlin, 2005.

 

The important part of this artwork is that the code is based on a physical formula, so that a piece of very specific scientific knowledge is embedded in the artwork. In this case, it is very important that the code remains untouched, just as a (scientific) formula would. The question of the installation's appearance and look is not of primary interest – even though it does follow a certain style, according to the aesthetic ideas of the time as well as the artist's “taste”. For the artist, the fact that the neural network concept comes across is more important than how the work will look and behave in the future. The work embodies a theory in a certain manner, so that we have a very clear understanding of the “core”: a very simple piece of code, as programmed by the artist. However, this core needs software libraries which are related to the artwork but are no longer the core itself. The artist did not program them; they are specific Linux libraries, which enable the video to be captured into the system.
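
To make the separation concrete, here is a minimal Python sketch under stated assumptions: the wave-like update is a placeholder, not Diebner's actual formula, and grab_frame() stands in for the Linux capture libraries. The point is structural: only one function touches the environment, while the 'core' loop could stay byte-for-byte untouched.

```python
import numpy as np

def grab_frame(shape=(120, 160)):
    """Stand-in for the capture libraries: in the real installation a camera
    frame would arrive here. Only this function touches the environment."""
    return np.random.rand(*shape)

def step(field, velocity, frame, c=0.2, damping=0.98):
    """Illustrative 'core': a damped wave-like update driven by camera input.
    This is NOT the artist's formula, just a placeholder with the same shape."""
    laplacian = (
        np.roll(field, 1, 0) + np.roll(field, -1, 0)
        + np.roll(field, 1, 1) + np.roll(field, -1, 1)
        - 4 * field
    )
    velocity = damping * (velocity + c * laplacian + 0.05 * (frame - field))
    return field + velocity, velocity

field = np.zeros((120, 160))
velocity = np.zeros_like(field)
for _ in range(100):
    field, velocity = step(field, velocity, grab_frame())
```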

 

PACKED: This is a case in which the museum obtained the source code from the artist when the work was acquired, which is not typically the situation.

Tabea Lurk: Yes. In the other case study, Schnur by Jan Voellmy8, the situation was different, because it is a Flash movie and we just had the executable file. Normally, the source files are needed in order to adjust an artwork to a new context. In this work, you see a piece of (orange or) blue tape applied to a wall and a projected image of a cord – as if a white cord were fixed to the wall with the tape. On a technological level, the basic concept is that every time someone speaks or makes a comment, the cord moves based on the audio input. The work is very simple and has a clear concept, but it is much more difficult to sustain than Liquid Perceptron.

In this context, even the operating system becomes “work relevant”, because the Flash Player is really required to display the artwork. Then the semantic questions arise: “What does Flash mean in this context? How important was it for the artist to use this software?” etc. Very often, the use of particular software is based on the context and on what happened to be accessible to the artist at that moment.

 

Installation plan of <i>Schnur</i>. Courtesy of Jan Voellmy.

 

PACKED: Does the “onion” approach allow you to determine which works will be more complex to preserve?

Jürgen Enge: This analysis of the core of an artwork has nothing to do with complexity, but it does give us a hint as to where we should make more effort during the preservation process. When the core is identified, we try to resist modifying it. If we take the example of a component connected to the core, like a video grabber, we can determine that the “core” needs it technically. But what is important is to have the correct interface to deliver, for instance, a video frame into the artwork – as in the case of Liquid Perceptron. We can therefore make more modifications in this part of the related software, as long as we take care that the interface to the core itself does not change – and as long as we have a clear idea of the parts in which the artist has written his or her concepts, in a formula for instance. By identifying the core, we can get an idea of where to concentrate our efforts.

Tabea Lurk: Yes, I think such a work logic, with its classifications, gives an idea of how complex the work is in terms of how difficult it will be to preserve and how the components relate to each other.

 

PACKED: Is the documentation phase also where the “work logic” is defined?

Tabea Lurk: Yes, the documentation is really a working tool. The purpose is not just to document the artwork so that the documentation can be available later and delivered or communicated. The documentation also helps one get a close look at the artwork so that one can identify its different components. The documentation process gives one a feeling of what the next steps and possibilities are for the artwork.

 

PACKED: In your writings you often talk about the “encapsulation” of an artwork. What does the term encapsulation mean in relation to the preservation of a computer-based art piece?

Tabea Lurk: In order to guarantee that the core of the artwork remains untouched, you need to create some kind of “carrier environment” or capsule in which you keep it. This functional unit is what we call encapsulation. The artwork is what is inside the capsule: the core and its environment. The outer surfaces of this encapsulated environment can then be migrated. Encapsulation is basically an enhancement of the idea of emulation. In her overview of different strategies for preservation, Kyong-Ho Lee9 used the same kind of terms to give this feeling that you create a capsule or bubble, which is then sustained in order not to touch the artwork's nature any further. Encapsulation touches on the philosophical battle between migration and emulation, and we don't want to answer that question by saying that there is only one good solution.

Jürgen Enge: From a technical point of view, encapsulation can use very different technologies. One possibility is emulation; another is virtualization. But there can also be other possibilities, like libraries emulating other libraries. For example, you can use Unix programs on a Windows system if you just have the correct emulation library. This does not emulate the hardware; it emulates the software environment, which is completely different from emulating a hardware environment. Encapsulation can mean just about anything that provides a software layer between the artwork and, for instance, the runtime environment10, the operating system or the hardware.

Tabea Lurk: In encapsulation, the capsule is not always complete. For example, in the case of virtualization, there are interactions between the virtual and the real machine (hardware), which again cause a kind of dependency. If you understand the term software layer as something that can in some way act as a translator between the artwork and what is outside the capsule, then it becomes clearer. The basic idea behind encapsulation is to get rid of the hardware dependencies in order to avoid issues with the obsolescence of certain equipment. However, it is clear that one can never really get rid of all of them.

Jürgen Enge: In information science there exist so-called “design patterns11” of software architecture. A proxy12 or a bridge13 is an example of a pattern that could provide the software layer needed for encapsulation. By looking closely at the design patterns of information science, many predefined strategies could be employed.
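
As a rough illustration of how such a pattern could provide the encapsulating layer (all class names here are hypothetical), the core calls one stable interface, and a proxy can answer those calls from archived material once the original hardware is gone:

```python
from abc import ABC, abstractmethod

class FrameSource(ABC):
    """The stable interface the encapsulated core depends on."""
    @abstractmethod
    def get_frame(self) -> bytes: ...

class LegacyUSBCamera(FrameSource):
    def get_frame(self) -> bytes:
        # Placeholder for the original, hardware-dependent implementation.
        return b"frame-from-usb-hardware"

class ArchiveProxy(FrameSource):
    """Proxy: answers the same calls, but from archived material,
    so the core need not change when the hardware disappears."""
    def __init__(self, recorded_frames):
        self._frames = recorded_frames
        self._i = 0
    def get_frame(self) -> bytes:
        frame = self._frames[self._i % len(self._frames)]
        self._i += 1
        return frame

def core(source: FrameSource):
    # The 'artwork core' only ever talks to the interface.
    return source.get_frame()

print(core(LegacyUSBCamera()))
print(core(ArchiveProxy([b"recorded-1", b"recorded-2"])))
```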

 

Analytical schema of the original system architecture of Hans Diebner's Liquid Perceptron. Courtesy of Jürgen Enge.

 

PACKED: Could we say that “encapsulation” is only taking a portion of the onion, and that the software layer is placed between two “skins” of that onion?

Tabea Lurk: Yes. Then it is just a matter of how the interface behaves, if you have, for instance, a feed-in signal with a video camera or some audio feed. Very often, pieces using video have a specific graphics card that creates these kinds of dependencies. This is why we try to have a clear definition of what is outside the capsule and what is inside.

Jürgen Enge: In the future we will therefore try to connect all hardware interfaces to a network. There are some general interfaces on a computer, and we try to determine which are more stable than others. An example is the interface of a TCP/IP stack14, which is much more stable than that of a special USB device. A ten-euro webcam is not as stable as a networked camera; the networked camera can be emulated much more easily than a USB camera.

One could also say that OpenGL graphics are much more stable than DirectX graphics, because OpenGL is open and can be emulated in software on new operating systems much better than DirectX versions on a new Windows system. We feel that it is not a very good idea to move from one proprietary hardware system to another, because the result remains the same. In the best-case scenario, you have to make the same emulation and restoration effort every few years. In the worst-case scenario you can even lose something in the meantime. This is why we try to avoid doing the same or equivalent work twice.

 

 Analytical schema of the sustained system architecture of Hans Diebner's Liquid Perceptron. Courtesy of Jürgen Enge.

 

PACKED: Is it like trying to define standards?

Tabea Lurk: No, it is the role of the archiving field to set the terms for qualifying better or worse formats, etc. We just try to apply their criteria to questions of software-based art. We are trying to use their risk-management habits, monitoring routines and definitions of quality standards. In general, there have been many concepts as well as tried-and-tested developments over the last few years which one can adopt. They are really important for us.

But the currently accessible standards basically address static file formats and database protocols, like for instance SIARD15, and not yet complex digital objects; they are especially good for research. Unfortunately, with computer-based art you always get to a point where you don't get an answer for a specific need, so you have to go a step beyond. For us this is actually possible because we are based at an art school and not at a museum that has its own collection and therefore its defined preservation strategies and pre-defined results. Being at a university allows us to conduct basic research. Based on our case studies, we have the possibility of going through the scientific papers that are continually being produced around the world, as well as being able to take a much closer look at specific questions. And we can start at a level we define and deem relevant – very often as the continuation of problems that occurred unexpectedly in earlier research.

 

PACKED: How did you start working on case studies? What collections do the works come from?

Tabea Lurk: Based on our experience with the works at the ZKM, we began to work with our own private CD-ROMs in order to find out how we could really save them and how they would work in an emulator. After that we started with artworks on the internet. The last two case studies we did came from private collections. In Bern we are able to act as a sort of third party and carry out case studies when asked by the owners. AktiveArchive was funded by the State Department for Cultural Affairs with the aim of becoming a competence centre. Our job was really to develop this kind of knowledge in order to support museums later. Johannes Gfeller16 works on the conservation of video and the preservation of vintage equipment.

The digital part is now based at the Bern University of the Arts, in the ArtLab of the Conservation Department, where I work, as well as at the Karlsruhe University of Arts and Design, where Jürgen heads the Research Department on Digital Archiving. People come to us if they have any questions about digital preservation, and we support them or, in complex cases, try to find possibilities to apply for research funding. Currently we are in discussions with, for instance, Tate Modern and the Kunsthaus Zurich about working on two cases.

 

 Exhibition view of Regionale9 in [plug.in], Basel, 29.11.2008 - 04.01.2009. Courtesy of Jan Voellmy.

 

PACKED: How did your years at ZKM influence your approach to such cases? How did they prepare you for the problems you encounter?

Jürgen Enge: In the 1990s I was involved in discussions with the artists who created, for instance, CD-ROMs, programs and installations, so I had a wonderful opportunity to see how these artworks were made. As an insider I had a deep view into the technology applied, because at that time I was there as a program engineer and all these artists needed programming support.

This experience still provides me with an insight into what was produced in the 1990s in general. I can estimate what kinds of errors were made at the time, because we often had to deal with, for instance, the very slow speed of computers, trying to speed them up with weird hacks. This experience is still present when I look at an artwork today. There is an unwritten but common set of solution strategies for widely known problems, and one can imagine which options were eventually applied to solve them.

When you analyse an artwork, the theories that were current at the time allow you to estimate which solutions or versions of software were likely. This really helps later on to identify problems and the importance of specific software elements. Conversely, one can still infer from the implementation strategy the point of view that an artist might have had. Often artists just taught themselves programming, so there is not necessarily a certain theory or philosophy behind the software applications used – but sometimes there is: specific open-source applications, the constant use of the latest technology or, on the contrary, only outdated materials, etc.

 

PACKED: In one of your texts you say that “computers are determined as artworks when software is installed on them”. Does this also mean that without the hardware the software would still be the work?

Tabea Lurk: Not necessarily; you can't say this as an isolated and general statement. There are cases where it is what the artist has touched that becomes the artwork, for example the pieces in which Cory Arcangel works with physical cartridges and their manipulation. These are really hard to preserve. You can't really encapsulate just the software, and I wonder if you could ever succeed in encapsulating any of Cory Arcangel's works without afterwards having to use the same hardware again for display, or without reducing the work to what we call “documentation”. It would be nice if it could function technically, but without the hardware you wouldn't call it an artwork anymore.

Here the hardware is really important. But I agree, it depends on what kind of artwork you are looking at. Our focus until now has basically been on artworks for which the computer is exchangeable, because it is just the running machine behind the artwork. It causes effects and keeps a certain logical concept moulded in software.

 

Super Mario Clouds, 2002-. Handmade hacked Super Mario Brothers cartridge and Nintendo NES video game system. © Cory Arcangel. Courtesy of Cory Arcangel.

 

Sometimes people think that the easiest way to preserve a computer-based artwork is to keep the hardware. You just buy a second Mac Mini, for example, and then you make a back-up image for security. I would say that it is an intermediate way of conserving a work preliminarily – a backup routine, so to speak. But what we try to do is to have some kind of second step and a concept of what happens if this hardware is no longer accessible. And this means one has to analyse, understand and document the artwork. Of course, first of all you have to decide whether the technical components have some sculptural relevance or not.

The same question applies to installation-based artworks, where you differentiate which components can be replaced and which cannot, in order to keep some kind of original appearance, look and behaviour. Very often, you accept a certain kind of degradation for some elements, but not for others that should be kept. This very much depends on the artist.

 

PACKED: Are changes acceptable as long as they are outside of the capsule?

Tabea Lurk: I would like to answer this with a statement and two examples given by the famous contemporary art conservator Erich Ganzert-Castrillo. Once, in a lecture, he compared a statement by Reiner Ruthenbeck with one by Katharina Fritsch. For Ruthenbeck, the perfect surface of his sculptures relates to the concept of the artwork, which means that even the smallest scratches or damage to the surface require repainting. The surface always needs to look perfect. Katharina Fritsch, in contrast, commented on “Tischgesellschaft” (1991) at the Museum für Gegenwartskunst Frankfurt – and I can only give the gist of her statement – that even though she prefers the moment when things are beautiful and brand new, she must accept the ageing process. She refuses continuous cosmetics, which would remind her of the look of a woman’s face after countless facelifts17.

 

PACKED: Is that the question of the patina in the context of computer-based art?

Tabea Lurk: Yes and it is a very tricky question. Even though we try to find known patterns of behaviour or handling and lean on “conservation ethics”, in order to see what might be adapted to digital questions, there are clear differences. It is therefore always challenging to find some kind of balance.

Jürgen Enge: For instance, in some artworks by Stephan von Huene the computer case as an object has to be present and is part of the installation. This is an important aspect for their conservation, because his works really address the shift or connection between analogue and digital. He used huge main boards on which he soldered the pins, because he really understood the procedure and sequence of actions on a very deep mechanical level. In the “Table Dancers” (1988-1995), for instance, he designed the puppets and their choreography pin by pin. He developed electronic circuits based on a broad knowledge of robotics. To put it differently: he was an artist from the analogue world who later shifted to the digital.

Tabea Lurk: Jürgen digitised parts of the scripts and of the exhaustive documentation that Stephan von Huene made when he was still alive. They contain a lot of the knowledge about the work and will be part of the DCA18 project. In some cases the computer case is part of the sculptural component; think of Bruce Nauman’s “Raw Material – BRRR” (1990)19, where the original wrapping box of the video projector is supposed to be present in the installation, functioning as a pedestal underneath the projector. If the computer or any other technical component holds such an important conceptual position, the core question is: would you exchange it?

 

Installation view of <i>Raw Material – BRRR</i> (1990) by Bruce Nauman. Photographer: Bernhard Schmitt. © VG Bild-Kunst 2004.

 

PACKED: But if it is just the casing, might there be a solution where it remains as a sculptural object and is no longer a functional machine?

Tabea Lurk: Yes, this question is often asked, because computers are becoming smaller and smaller and you could hide a running system with an encapsulated artwork inside the bigger casing of the older machine. However, if the same artwork is in three or four collections, as mentioned before, you often see that each collection chooses a different conservation strategy, because the strategy is also deeply related to the policy of the collection.

The question of the best way to preserve a digital artwork is a controversial discussion, and it is nice to be an art historian, because you don't need to judge it. I think it is very difficult to know what is better or worse, because if you say that you will migrate, several people will tell you that migration is worse than encapsulation, and so forth. We are really interested in maintaining the discussion as an on-going process and in developing a more sophisticated vocabulary and tools. In order to make statements, I think it is important that we work on a level that is closely related to the work. We are looking for routines for museums which clarify what should happen when an artwork first comes into the collection, so that the artist's PC, for example, is not immediately brought to the exhibition to run for five or seven weeks before someone asks whether a backup has been made.

 

PACKED: Are you developing these kinds of routines?

Tabea Lurk: Sort of. But one difficult aspect in explaining our approach is that we think that, on the one hand, people shouldn't be too afraid to touch digital artworks. On the other hand, if you have an artwork based on acrylic glass, you should ask a materials scientist how to work with that material. The same should be done with digital art. Even if everybody uses computers daily, it does not mean that we really understand them. At a certain point, it might be good to consult an information scientist and let him or her take a closer look. But I think that basic routines and backups should really be possible.

 

PACKED: Museums have this problem because working on digital art preservation requires a lot of different kinds of knowledge. Whom should a museum ask for support?

Jürgen Enge: I think the problem should be approached piece by piece. The first step is to identify the artwork on a preliminary level, which means being able to observe the artwork and, for instance, to gather and identify external devices, etc. This should be feasible for everyone, and there should be charts or database sheets supporting it. But very often these external devices do not necessarily relate to the internal core of the work on a technical level. In order to understand the internal structure of an artwork, and to identify further steps for preventive conservation or restoration, a museum should put its questions to different people.

On the one hand, you need, for example, the knowledge of an art historian, who knows how to understand the concept and address questions related to authenticity. On the other hand, there are technical questions, where knowledge of the logical structure of program architecture is required. That is when an information scientist or a program engineer who understands the theory of computers should be contacted. Such a person is also the one who might later have an idea about what is likely to happen in the future, if a given technology no longer works.

 

 Scheme characterizing the four different core areas (dimensions) of the work logic. Courtesy of Tabea Lurk.

 

PACKED: So the museum has to go outside the museum to find help in different communities through some kind of networking?

Tabea Lurk: Yes, but there are specific routines that can also be maintained by museums themselves, especially the huge museums that have an IT department. They can get a short introduction on how to make a proper back-up copy and on first-aid procedures, so that they can start to work in-house. Several points are similar to classical conservation science. One big difference in contrast to material-based artworks is that digital artworks are no longer singular units or physical objects. Problems can therefore be split. For conservation purposes it is not enough to know how to handle a computer or how to operate the software. You have to document the required parameters in order to adjust the artwork to a specific place. These parameters could be, for instance, the camera calibration of a video surveillance system.

For preservation purposes it is necessary to go beyond those steps. I don't mean that conservation is in general too difficult – networking is a good strategy because both sides can learn from each other – but understanding the work logic of an artwork is often way beyond current conservators' experience and mathematical knowledge.

Jürgen Enge: You also have to distinguish preserving a digital artwork from repairing it. Very often artworks are just repaired. If you repair a car you just exchange parts, and even if you can drive the car again afterwards, you wouldn't say that you have preserved the car for the future.

Tabea Lurk: In a recent discussion, we came to the conclusion that artists who really work in the medium of computer- or software-based art are very often extremely aware of what can go wrong. They really try to deliver the artwork already adjusted so that it is ready for display: the kiosk mode is activated, certain aspects are hidden, etc. You just need to find the file on the desktop that you need to start, if the artwork does not automatically go into display mode. We need to learn to accept this for display purposes, but we also need some kind of second line for preservation purposes and viewings, for which an executable, for instance, is not enough. This second line can very often be slower; you have to ask for the source code, all the related assets of an artwork and every kind of question that the Matters in Media Art20 questionnaire contains. To put it another way: I think that display and archival long-term preservation are two different procedures with separate approaches.

 

PACKED: That is where some kind of agreement has to be made with the artist as well?

Tabea Lurk: Yes, and it can be interesting for both sides, because artists can't preserve all their works on their own.

 

PACKED: How did you begin to use virtualization?

Tabea Lurk: What first brought it to the fore was the need to show a network-based artwork offline. It was a piece related to the Xcult server21, an important node in Switzerland which appeared at the same time as the Rhizome Art Base. Many of the artworks on this server were just displayed online. A special series of web projects was related to this server: “Shrink to Fit”22.

Anybody can get hold of a small script, which functions as a kind of entrance and navigation control for the series, to use on his or her own website. Originally, when the project started, a new piece was displayed once a month or every second week, I believe. It was a sort of collection of works, with the idea of an exhibition on the web pages of cultural institutions and of all kinds of other people. The pieces were made to fit very small screen sizes.

Technically it wasn't so difficult to have it running offline, because the interaction components were based on the script and no network interrelation with Google or any web services was required. The first thing we did was to put it on a CD-ROM to make it portable. Then we used an Apache web server on a normal desktop computer in order to have server-like communication with the client.

As it was originally meant to be displayed from the network, we really wanted to use network communication (and not just open HTML pages with a local browser), even if the piece could be shown without it. The problem with the CD-ROM solution was that it is a read-only medium. It is burned once, and if the system needs any kind of temporary memory, which is normally written to the hard drive and RAM, there is no such option on a CD-ROM. That is why we started to work with virtualization: in order to have some option for interaction changes, or at least temporary memory stored in the RAM.
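
As a minimal sketch of this idea in Python (the directory name is hypothetical), the standard library's built-in HTTP server can play the role that Apache played here, so the browser still talks to a real server instead of opening local files:

```python
# Serve the archived site over real HTTP on localhost, so the client keeps
# its server-like communication instead of reading files directly.
import functools
import http.server
import socketserver

# 'shrink_to_fit_archive' is a hypothetical directory holding the site files.
handler = functools.partial(
    http.server.SimpleHTTPRequestHandler, directory="shrink_to_fit_archive"
)
with socketserver.TCPServer(("127.0.0.1", 8080), handler) as httpd:
    print("serving archived site at http://127.0.0.1:8080/")
    httpd.serve_forever()
```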

 

<i>Shrink To Fit</i>'s Flash interface on xcult.org.

 

Jürgen Enge: What we also managed to preserve with virtualization was the URL or domain name, which is often very important and part of the artwork.

Tabea Lurk: We also thought about the browsers used from 1995 to 1997. We wanted to have some kind of browser gallery. As you need the older systems to use the older browsers, we thought that virtualization was – in its current state – a good solution. For each operating system we can build the software libraries and make it accessible. Then we can make an image23 of this, with the artwork inside the virtual machine, and that might be a tool to keep it portable. Later on came the question of using virtualization for a CD-ROM library, in order to define the starting points from which the artwork always starts.

Jürgen Enge: Our CD-ROM library does not insert CD-ROMs; it inserts an operating system with a given state. The question is no longer how to put a CD-ROM into a system, as we can just start an operating system from a snapshot24.

 

PACKED: So the user doesn't have anything to do with the virtualization software itself?

Jürgen Enge: No, and that was another reason for using virtualization solutions, as they can be remote-controlled. This is an important point for us, because we don't want the user to follow such steps and go into the virtualization menu. You have to remote-control this via the network, for example, and build a system which allows you to use perhaps a MIDI device to start virtual machines. The graphics port25 and the mouse are used for the display of the CD-ROMs. You can't rely on the keyboard functions, as they belong to the virtual machine. It is, however, possible to use a MIDI device like a drum, for example, that might start the virtual machine when you hit it. It would even be possible to slow the virtual machine down with a MIDI fader, and make it run slower or faster.
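
A hedged sketch of what such remote control could look like, assuming VMware Workstation's standard vmrun command-line tool and the Python mido library for MIDI input (the VM path and snapshot name are hypothetical):

```python
import subprocess
import mido  # requires a MIDI backend such as python-rtmidi

VMX = "/vms/artwork_win98.vmx"   # hypothetical path to the virtual machine
SNAPSHOT = "clean-start"         # hypothetical snapshot of a known-good state

def start_artwork():
    # Revert to the snapshot, then boot: the visitor never sees the
    # virtualization menus, only the running artwork.
    subprocess.run(["vmrun", "-T", "ws", "revertToSnapshot", VMX, SNAPSHOT], check=True)
    subprocess.run(["vmrun", "-T", "ws", "start", VMX], check=True)

with mido.open_input() as port:     # default MIDI input, e.g. a drum pad
    for msg in port:
        if msg.type == "note_on":   # a hit on the pad starts the machine
            start_artwork()
            break
```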

Tabea Lurk: We want the more complex, technical parts to be more or less hidden, because the user is often not willing to know about them. Moreover, it could really destroy the artwork if access is too complex.

 

PACKED: What virtualization and emulation software do you use?

Jürgen Enge: We decided to use VMware Workstation because it has a very good API for remote control and a very stable driver subsystem. This offers us drivers for sound cards on Windows 98, for example. Other virtualization systems, like Microsoft's Virtual PC and Sun Microsystems/Oracle's VirtualBox, have many more network problems, driver problems, etc. The choice of an emulator really depends on the requirements and the conservation concept. For example, we have had cases from the Demo Scene26 where we used DOSBox27 because we could slow it down and speed it up remotely. For these types of works it was a very good solution. But I'm also using QEMU28 and a VHDL29 interpreter, which from a theoretical point of view is the perfect emulator. As there is occasionally a driver problem with DOSBox in practice, QEMU is then better.

Tabea Lurk: When we started to use virtualization, there were a lot of issues when using sound cards, for example, or interacting with them. As a result we often combine virtualization and emulation to make sure that we can cover all the requirements and specific configurations.

 

PACKED: So it is not that virtualization technically works better than emulation?

Jürgen Enge: No, sometimes you really need to combine them, and this sometimes depends on the power of the computer that you are using. For instance, if you emulate a software-based artwork that is based on costly graphics, or something which is more like a vintage DOS game, then you need a very fast machine. Virtualization needs quite a fast host machine. Sometimes these new computers that are about 10 cm in size are too slow for virtualization. As such, the virtualization application is deeply related to the current technical standard of the specific hardware you are using. Again you are facing hardware problems, but we hope that we will soon have small but more powerful computers, in order to enable emulations that might then last for several years.

 

PACKED: How far does your work on web-based art relate to virtualization?

Jürgen Enge: For net-based artworks we normally use server virtualization, not the aforementioned workstation virtualization. We have our own VMware ESX infrastructure, which can easily be applied to the server part, because there is normally no need for specific hardware interfaces. Furthermore, due to security issues, on the (art) server's side we try to have current systems that aren't as old as the work's core. If we have access to all the required components of the systems, we have the possibility of building up a virtual network – which is possible with this VMware infrastructure. It also has special firewalls. Furthermore, we can have different instances or units of our Netart Router30, which is capable of going into the server for deep packet inspection31.

Tabea Lurk: Virtualization plays a lesser role in our work on internet-based art. For works using the network, we had the idea of using proxies as a software layer. We have mostly run documentation sessions with the Netart Router, so what we have is mostly recorded contexts.

For the conservation of the server or the network-based parts of artworks we use the Netart Router. As we explained with the work logic, we don't want to touch the artwork, and very often we have pieces which are interrelated with web services like Google or others. For these we have a different conservation strategy, because we want the work just to keep running as it is. Usually, if Google changes its API32, the artists adapt their code, but this isn't what we want. We also thought that if a museum had five different works using the Google API, all of them would need to be adjusted, which is just not what we want. So we put the Netart Router between the artwork and the network in order to have a routine and an interface, so that we could, for instance, adjust modified APIs or even – following the artist's suggestions – replace the Google web service, if it no longer existed, with whatever might be used in the future.

 

VMware ESX Server virtualizes server storage and networking, allowing multiple applications to run in virtual machines on the same physical server. ©VMware.

 

PACKED: How exactly do you use the Netart Router?

Tabea Lurk: The Netart Router has three main roles. The first one is analysis and documentation: we can recall and keep a record of the communication, based on the protocols that are exchanged when a work is used. The second function is to archive or store the records and thus provide access if certain parts are no longer functional. The user of an interactive web-based artwork, who is, for instance, experiencing the artwork by searching for a specific term, could see a combination of live, real-time generated sessions and stored material from the archive for the parts which are technically no longer supported (like outdated web services, webcams, etc.). In this way you can map the changes in a network environment. And the third function is a specialised proxy.
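
The first role can be illustrated with a toy recording proxy in Python; this is emphatically not the Netart Router's code, just the underlying idea of logging each exchange while forwarding it (log file name is hypothetical, and only GET requests are handled):

```python
import json
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

LOG = "sessions.jsonl"  # one JSON record per exchange

class RecordingProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        # When the browser is configured to use this server as an HTTP proxy,
        # it sends the absolute URL as the request path.
        with urllib.request.urlopen(self.path) as upstream:
            body = upstream.read()
            status = upstream.status
        with open(LOG, "a") as f:
            f.write(json.dumps({
                "time": time.time(), "url": self.path,
                "status": status, "bytes": len(body),
            }) + "\n")
        self.send_response(status)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("127.0.0.1", 8888), RecordingProxy).serve_forever()
```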

Jürgen Enge: To give an example: at the moment we are thinking about ways to deal with IPv6.33 As most artworks until now use IPv434, we will have to somehow 'proxy' them. If an artwork runs on an older operating system, you will never get IPv6 working with Windows 98. We will need to have an in-between function. We are actually migrating our VMware and the whole network infrastructure to IPv6, so we have a dual-stack infrastructure where we can really address both IP versions (6 and 4).

Our target in the near future is to have a clean IPv6 infrastructure that offers a bridge to our IPv4 environment. Eventually, in about five years, there might be institutions with no old IP addresses anymore. But if they own older artworks which require IPv4 or vintage browsers, etc. in order to deliver the old web pages, we will have to find a way of bridging between IPv6 and, say, old Windows 98 browsers working with IPv4. The reason for looking for additional functionalities or proxies is thus to have a comparable “onion system”, as mentioned before with the work logic, but on the network level. Even if it seems really distant – and seemingly on the periphery of preservation questions – it is important from our point of view.
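
One simple form such an in-between function could take is a transport-level relay that listens on IPv6 and forwards each connection byte for byte to an IPv4-only machine. A minimal sketch, with the legacy address assumed for illustration:

```python
import socket
import threading

LEGACY = ("192.0.2.10", 80)  # hypothetical IPv4-only host, e.g. a Windows 98 VM

def pump(src, dst):
    """Copy bytes in one direction until the source closes."""
    try:
        while data := src.recv(4096):
            dst.sendall(data)
    finally:
        dst.close()

listener = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
listener.bind(("::", 8080))  # modern clients connect here over IPv6
listener.listen()
while True:
    client, _ = listener.accept()
    backend = socket.create_connection(LEGACY)
    threading.Thread(target=pump, args=(client, backend), daemon=True).start()
    threading.Thread(target=pump, args=(backend, client), daemon=True).start()
```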

 

PACKED: In this case you could keep the work in an IPv4 configuration, which is also part of the historical and technological context of the work.

Tabea Lurk: With net-based artworks, the artist often maintains an artwork for a long time – which can cause substantial change within a piece. The question was: how is it possible to allow artists to continue changing their works, as it is really their wish to have them working in the long term? On the other hand, it is also important for museums to end up with some kind of collection of “certain states”, which allows them to go back to the look of a former version. If the artwork is interactive and grows with the interaction of the users, the museum might want to restore an earlier stage. Virtualization allows for this kind of resetting by keeping several snapshots. It's the same with the concept of the Netart Router, where you have certain caches which are dated, in order to be able to see, for instance, the session from 2002 or from 2011, and so on.

While the Wayback Machine at the Internet Archive has a lot of wonderful material, there are unfortunately also a lot of gaps, and the display of many websites is no longer maintained. At this point, our current research looks at browser emulation as an option to guarantee the original look and appearance of these websites.

Jürgen Enge: In addition to the browser display, you have to profile the network speed. In former times, modems worked at varying speeds. Even if velocity is the easiest part to control – on Linux, for instance, you can throttle the speed through the network configuration – the original speed has to be researched.

Tabea Lurk: Speed configuration can also be controlled with the Netart Router.
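
On Linux, the standard tc traffic-control tool is one way to impose such a speed profile from outside the work; a hedged sketch (the interface name and the 56k modem rate are assumptions, and the commands need root privileges):

```python
import subprocess

IFACE = "eth0"  # assumed network interface serving the artwork

def throttle(rate="56kbit"):
    """Limit outgoing bandwidth with a token bucket filter, approximating
    the feel of a 1990s modem connection."""
    subprocess.run(
        ["tc", "qdisc", "add", "dev", IFACE, "root",
         "tbf", "rate", rate, "burst", "16kb", "latency", "400ms"],
        check=True,
    )

def restore():
    """Remove the artificial limit again."""
    subprocess.run(["tc", "qdisc", "del", "dev", IFACE, "root"], check=True)

throttle()  # the work is now served at roughly period speed
```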

 

PACKED: Is the network speed an important factor for net-based artworks?

Tabea Lurk: In a work from 1996 by Olia Lialina, My Boyfriend Came Back From War, this really was the case. In a presentation in Karlsruhe, Olia Lialina mentioned that the original feeling of the work was gone, because at that time you were really waiting and wondering what would happen next. Now everything goes faster and comes up without the tension that former users experienced. She has now restored the work, not especially because of the speed issue but because of additional problems that had arisen. The case study is available on the INCCA website.35

 

Screenshot of <i>My Boyfriend Came Back From War</i>, 1996, Olia Lialina.

 

PACKED: What kind of limitations have you encountered when using virtualization?

Jürgen Enge: Sometimes it is just limitations related to the internal performance of the computer. If you have an artwork that really needs a wonder PC with a quad-core processor or a Core i7, then you just can't virtualize it. It is the same problem when a work is based on a very new gaming engine, because most of the time you need a really fast graphics card. That is the reason why, until now, it has very often been impossible to virtualize such works.

Tabea Lurk: There are features which are not yet supported in the way that we would like to use them, especially the option to slow down a system in such a way that not only the visuals but also the sound are played at the proper velocity. It's hard to slow down sound, because we are much more sensitive to it. And even when the virtual machine's speed is correct, the sound goes jerky: “tac tac tac”…

 

PACKED: What problems related to copyrights did you encounter?

Tabea Lurk: Well, this is a problem that we really need to face, and I think we should have something like they have in the U.S., where archives are allowed to go beyond certain rights limitations for preservation purposes, for instance for preserving games or CD-ROMs that are protected by passwords or DRM technology.36

 

PACKED: Are there also rights issues when you work with emulation?

Jürgen Enge: Yes, for Mac products it is very tricky because you are not allowed to copy the BIOS.37 For Windows we have contacted Microsoft and there are no problems. You can use any older versions of Windows, but they won't provide you with help if it doesn't work properly. For Apple computers, the ROM38 is the main problem, because if you touch it, you really enter an illegal situation. I don't know how to handle it legally.

Tabea Lurk: Works using Apple computers cause a lot of problems, because virtualization is not really supported and, as Jürgen said, you are in a legal grey zone when using a Mac emulator. Technically it is much easier to get virtualization software or emulators for all kinds of Windows and Linux operating systems than for Apple operating systems. It is a pity, because lots of artworks use Mac machines. For artworks made in Max/MSP running on an Apple computer, it is very difficult to get rid of the Mac environment or even to achieve a sustainable encapsulation.

Jürgen Enge: It is possible to emulate old Apple computers, but it is quite hard to emulate a newer Mac OS. There are ways to emulate Mac OS X server versions, but they are not made for high-speed graphics, so it is very problematic.

 

PACKED: Aren’t there solutions other than emulation for works using Max/MSP?

Tabea Lurk: I still hope that the Max/MSP works we are working on can be translated or migrated to Pure Data at some point, but I'm not sure if we will succeed. There is a great number of Max/MSP works, simply because Max/MSP is such a nice tool with which you can do so many different things, and a lot of artists use only that software.

 

PACKED: What kinds of works that are being conserved at the moment might become problematic to preserve in the future?

Tabea Lurk: There is a whole new kind of work coming up that uses mobile devices and smartphones, which are easy to get hold of but very problematic in that they rapidly become obsolete.

Jürgen Enge: The first emulation software for smartphones is beginning to appear. They use different software environments for business and for personal/private applications. So emulating smartphones won't be a problem, but you won't have the interfaces anymore. What you will have is an emulation running on a PC. It is very difficult to deal with works that are based on specific hardware. If you have, for example, an artwork that is based on a Fireface sound subsystem, which is a forty-channel audio device, I don't know what should be done if it is no longer possible to get hold of the same hardware.

 

PACKED: Hardware obsolescence will continue to be a problem in the future.

Tabea Lurk: Obsolescence will continue to be a problem with works using smartphones. To mention a historic example: Lynn Hershman used a LaserDisc system in her work “Lorna”. The user navigates with a remote control device, and the navigation – including the image of the remote control – is visible. So the hardware is really important, as is the mapping of the numbers on the remote control to the ones in the image, etc.

 

Installation view of <i>Lorna</i> (1983-1984) by Lynn Hershman Leeson. © Lynn Hershman Leeson.

 

Videostill of <i>Lorna</i> (1983-1984) by Lynn Hershman Leeson. © Lynn Hershman Leeson.

 

PACKED: Could you describe your documentation methods? Do you have specific routines for this?

Tabea Lurk: It is basically organised according to the guidelines of conservation science, containing the documentation of the current state of the artwork, the examination concept, including the treatment planning, and then the documentation of the treatment. We start with the identification of the hardware components, if they are relevant, which means that the documentation is not complete in the sense of covering everything from the OS to the software libraries, etc.; it is rather hierarchically structured according to importance, which in turn indicates the functionality covered by the component(s). There is a clear structure that normally goes down to the file level.

You need to take a quick look into each file and note down at which point the code calls which software library – or preferably which file from the required library – in order to perform certain functions. For instance, ImageMagick39 is often used, and the documentation should not only tell you that ImageMagick is used, and in which specific version, but also which kind of function is called in order to generate the images.
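
A simple helper can support this kind of file-level inventory. The following sketch is only a heuristic (the directory name and file extensions are assumptions): it lists every line in the sources that appears to invoke an ImageMagick tool, as a starting point for the documentation.

```python
import re
from pathlib import Path

# Names of the ImageMagick command-line tools we want to spot in the sources.
PATTERN = re.compile(r"\b(convert|mogrify|identify|ImageMagick)\b")

def scan(source_dir="artwork_src", extensions=(".php", ".pl", ".py", ".sh")):
    for path in Path(source_dir).rglob("*"):
        if path.suffix in extensions:
            text = path.read_text(errors="replace")
            for lineno, line in enumerate(text.splitlines(), 1):
                if PATTERN.search(line):
                    print(f"{path}:{lineno}: {line.strip()}")

scan()  # 'artwork_src' is a hypothetical source directory
```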

 

PACKED: And you also document the changes that you make?

Tabea Lurk: Yes. For example, a fresco with blemishes, or where some of the colours have come off, is often reconstructed in such a way that you can still recognise what is original and what was restored. With computer-based, and especially software-based, artworks, in order to maintain access to the former appearance of the work and to keep it readable, you might complete the missing parts but keep the completion visible. In our practice the documentation of the "treatment" is done similarly, but at the code level.

 

PACKED: Is this code documentation with comments always done the same way?

Jürgen Enge: It depends heavily on the programming language, because some, like C40 and C++41, have preprocessors, for instance. The old programming languages used to have their own ways of commenting. Your comments just need to be clearly identified and not mixed up with the comments the artist made. If necessary, there should be a way of getting rid of all the changes that you have made and getting back to the original code through an automatic procedure. This is only possible when you have commented correctly – and it is possible with all programming languages.
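
As an illustration of such an automatic procedure, the following sketch assumes a hypothetical marker convention ('CONS:') for conservators' comments in C-style code; stripping the tagged comments recovers the original text while leaving the artist's own comments untouched:

```python
import re

# Match only comments that carry the agreed conservator tag.
MARKER = re.compile(
    r"[ \t]*/\*\s*CONS:.*?\*/"     # tagged C-style block comment
    r"|[ \t]*//\s*CONS:[^\n]*",    # tagged C++-style line comment
    re.DOTALL,
)

def strip_conservator_comments(source: str) -> str:
    return MARKER.sub("", source)

annotated = """\
int main(void) {
    run_artwork();  // CONS: 2011 restoration, original binary recompiled
    /* CONS: library call redirected to archived web service */
    return 0;
}
"""
print(strip_conservator_comments(annotated))
```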

Tabea Lurk: Concerning the structure of the documentation, you could, for example, start from the core file and then go up and down, so that the way the files are commented and presented in your documentation later represents the hierarchical structure of the work's organisation at the code level. You need to know the function and the interrelations of each file or part of the script. Interrelations reach up to higher or down to lower levels, towards the system environment. The different kinds of sequences or procedures need to be clearly structured and described.

The documentation does not necessarily start at page 1 and continue straight on until page 12; different parts might rather be added later, whenever additional information emerges. Once finished, the documentation should give the next person working on the piece a clear idea of what the artwork is and what has happened to it previously. Our documentation also often contains screenshots of the code, in order to make sure that the parts we inserted to obtain a stabilised or preserved version of the work are differentiated from the original code. Comments that are not the artist's should also be clearly identified in the new version.

Jürgen Enge: After reading the documentation of a work, you should be able to tell whether the preservation is good or not.

 

PACKED: Is documentation a very time consuming process?

Tabea Lurk: It depends on the artwork, but normally we combine the documentation work with the preservation. If, for instance, you migrate the piece onto a virtual machine in order to see whether you have everything running and all the required elements, it really makes sense, as prescribed by the ethical guidelines, to document the artwork in parallel. Then you can double-check.

Jürgen Enge: The time you spend on the documentation also depends on how familiar you are with the specific programming language used to make the work.

Tabea Lurk: It also depends on how far you go into documenting a work, and on whether you need to have all the files commented. I, for instance, documented extensively Marc Lee's work “Breaking The News”, which accesses fourteen web services. I opened each script and commented on each of them to see where it went, which library it addressed, etc. The documentation could be done much faster, and I can imagine that at a certain point someone will develop a sort of web interface allowing you to click and access the files as if they were a kind of linked website.

 

PACKED: What tool do you personally use?

Jürgen Enge: Internally, we use MediaWiki for documentation, because you can easily interlink things. As the same problem can appear multiple times in different artworks, we can link them. MediaWiki also has many plug-ins, for example to display beautified program code. Another nice feature is the fact that all previous versions are archived, so if someone has changed something, we can go back to a previous version and see who changed what. From our experience, a wiki, and especially MediaWiki, is a very good tool for such documentation.

 

Installation view of <i>Breaking The News</i>. Courtesy of Marc Lee.

 

PACKED: In the coming years how do you think museums will be able to deal with all these computer-based works?

Tabea Lurk: In general I'm very optimistic, because new technological solutions do not only cause problems; they also give new ideas, and we should not use information technology only as a supporting science. I have the feeling that if we could learn more from that field, we could solve problems with the solutions it has developed. It is just a matter of knowledge transfer and application, and I would appreciate it if we could go a step beyond what we are used to. Especially in the German context, our guidelines mainly result from long-term archiving practice or from self-teaching, without really studying information science and mathematics. I think that we work in a very dynamic field, but currently we are not as free to develop our ideas as we would like.

 

PACKED: Are you also optimistic about works using resources like YouTube, RSS feeds, etc., or even works located in virtual worlds like Second Life?

Tabea Lurk: I think that strategies to archive web 2.0 communities already exist. It will be interesting to see how they do it and how far the archives in current cultural institutions will be able to build bridges. For sure some information will be forgotten, and a lot of material will be lost or only documented, but there will be a kind of plurality. It is hard to decide where it will go; this kind of work has its own kind of cybernetic life and development. In the future I hope that society and a new generation of cultural workers will be much more open to this kind of technological development.

Technologically, I have the feeling that it might become easier at a certain point. We mentioned earlier that with smartphones we shall once again have to look at the question of the hardware's technological obsolescence, but at the same time we are on our way to the cloud, and in cloud computing42 you no longer rely so much on the physical thing. Also, if HTML 543 gets standardised, it will be much easier to maintain data, because HTML 5 is clearly structured and one can even display vector graphics or TIFF images, which couldn't be displayed by former browsers. Video is embedded and displayed directly, etc.
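As a rough illustration of that structural clarity, a small Python sketch, assuming a hypothetical archived page, that counts the HTML 5 elements (video, audio, canvas, inline SVG) which former browsers could not display natively:

```python
from html.parser import HTMLParser

class Html5Scan(HTMLParser):
    """Count HTML 5 elements that older browsers could not display natively."""
    TAGS = {"video", "audio", "canvas", "svg"}

    def __init__(self):
        super().__init__()
        self.counts = {tag: 0 for tag in self.TAGS}

    def handle_starttag(self, tag, attrs):
        if tag in self.TAGS:
            self.counts[tag] += 1

scanner = Html5Scan()
# Hypothetical file name: a locally archived copy of a net artwork's page.
scanner.feed(open("archived_page.html", encoding="utf-8").read())
print(scanner.counts)
```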

 

PACKED: So constant monitoring of new technologies is needed? Is it something you do?

Jürgen Enge: In computer-based art preservation, you invest a lot of work in preserving old objects and artworks. Many of the problems we have originate from the fact that we have to understand old software, old computers, etc. So if you look at it from a technical point of view, I don't want to be behind any more. My strategy is to think about how to preserve what will be made in the future. That is the reason why we are, for example, investigating IPv6 and browser-based applications, which will appear during the coming years.

I want to know how to deal with this type of technology before it actually appears. I want to examine the kinds of problems ahead of time and build solutions for them. I don't want to end up with a large number of issues that I cannot deal with, and which become more problematic because the technology is no longer supported or because one would need to store vintage equipment. The quality of the work doesn't improve if I'm only ever dealing with older materials. I really want to investigate future problems further and make sure that we will have a chance to deal with them.
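A minimal sketch of the kind of forward-looking IPv6 check mentioned above, assuming Python and a placeholder hostname; it uses only the standard socket library to test whether the host serving a net artwork already resolves to IPv6 addresses:

```python
import socket

def ipv6_addresses(host: str) -> list[str]:
    """Return the IPv6 addresses a hostname resolves to, if any."""
    try:
        infos = socket.getaddrinfo(host, 80, socket.AF_INET6)
    except socket.gaierror:
        return []  # no IPv6 resolution (or the name does not resolve at all)
    return sorted({info[4][0] for info in infos})

# Placeholder host, standing for the server of a net artwork one is monitoring.
print(ipv6_addresses("www.example.org"))
```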

Tabea Lurk: We have made some small achievements, but it would be wrong to say that there is a systematic approach to all the problems, or a strategy for meeting every kind of requirement. There are still a lot of gaps that need to be filled.

 

 

Notes:

 

  • 1. The research project AktiveArchive is an initiative of Bern University of the Arts (BUA) in collaboration with the Swiss Institute for Art Research in Zurich (SIK/ISEA). See: http://www.hkb.bfh.ch/de/forschung/forschungsschwerpunkte/fspmaterialita....
  • 2. ZKM holds a unique position in the art world; it is an interdisciplinary research institution focusing on new media. Since its opening in 1997, the ZKM has become an important platform for the production and exhibition of contemporary art and emergent media technologies. Since 1999 the institute has been led by the artist, curator and theoretician Peter Weibel. See the interview on this website with Christoph Blase, who is running the Laboratory for Antique Video Systems at the ZKM: https://www.scart.be/?q=en/content/interview-christoph-blase-zkm
  • 3. Media Museum. See: http://on1.zkm.de/zkm/e/institute/medienmuseum
  • 4. Institute for Net Development. See: http://on1.zkm.de/zkm/e/institute/Netzentwicklung/
  • 5. An Open Archival Information System (or OAIS) is an archive, consisting of an organization of people and systems, that has accepted the responsibility to preserve information and make it available for a Designated Community. The term OAIS also refers, by extension, to the ISO OAIS Reference Model for an OAIS. This reference model is defined by recommendation CCSDS 650.0-B-1 of the Consultative Committee for Space Data Systems; this text is identical to ISO 14721:2003. Source: Wikipedia.
  • 6. The interdisciplinary project GAMA – Gateway to Archives of Media Art was launched on 1 November 2007 by 19 participating organisations from Europe's culture, art and technology sectors, with the aim of establishing a central online portal to different European media art collections for the interested public, for curators, artists, academics, researchers and mediators - an endeavour supported by the European Commission within the framework of the eContentplus programme.
  • 7. See: http://performative-science.de/liquidperceptron2d.html
  • 8. See: http://www.v78.org/
  • 9. See: State of the Art and Practice in Digital Preservation; http://web.archive.org/web/20100527173801/http://nvl.nist.gov/pub/nistpu...
  • 10. A run-time system (also called runtime system or just runtime) is a software component designed to support the execution of computer programs written in some computer language. Source: Wikipedia.
  • 11. In software engineering, a design pattern is a general reusable solution to a commonly occurring problem within a given context in software design. A design pattern is not a finished design that can be transformed directly into code. It is a description or template for how to solve a problem that can be used in many different situations. Source: Wikipedia.
  • 12. In computer networks, a proxy server is a server (a computer system or an application) that acts as an intermediary for requests from clients seeking resources from other servers. A client connects to the proxy server, requesting some service, such as a file, connection, web page, or other resource available from a different server. The proxy server evaluates the request according to its filtering rules. Source: Wikipedia
  • 13. The bridge pattern is a design pattern used in software engineering, which is meant to "decouple an abstraction from its implementation so that the two can vary independently". Source: Wikipedia.
  • 14. The TCP/IP model describes a set of general design guidelines and implementations of specific networking protocols to enable computers to communicate over a network. TCP/IP provides end-to-end connectivity specifying how data should be formatted, addressed, transmitted, routed and received at the destination. Protocols exist for a variety of different types of communication services between computers. Source: Wikipedia.
  • 15. The SIARD format (Software Independent Archiving of Relational Databases), was developed by the Swiss Federal Archives (SFA) and is part of the digital archiving platform of the European research project Planets. The format specifications are open and published. Source: https://www.bar.admin.ch/bar/en/home/archiving/tools/siard-suite.html
  • 16. See the interview with Johannes Gfeller.
  • 17. See: http://cool.conservation-us.org/waac/wn/wn21/wn21-2/wn21-208.html
  • 18. See: http://www.dca-project.eu
  • 19. See: http://www.medienkunstnetz.de/works/raw-material/
  • 20. See: http://mattersinmediaart.org/
  • 21. See: http://www.xcult.org/
  • 22. See: http://www.hub3.org/forsch1/forsch/shrink_list.html
  • 23. A disk image is a single file or storage device containing the complete contents and structure representing a data storage medium or device, such as a hard drive, tape drive, floppy disk, optical disc, or USB flash drive. A disk image is usually created by creating a complete sector-by-sector copy of the source medium and thereby perfectly replicating the structure and contents of a storage device. Source: Wikipedia.
  • 24. In computer systems, a snapshot is the state of a system at a particular point in time. The term was coined as an analogy to that in photography. It can refer to an actual copy of the state of a system or to a capability provided by certain systems. Source: Wikipedia.
  • 25. The Accelerated Graphics Port (often shortened to AGP) is a high-speed point-to-point channel for attaching a video card to a computer's motherboard, primarily to assist in the acceleration of 3D computer graphics. Source: Wikipedia.
  • 26. The demoscene is a computer art subculture that specializes in producing demos, which are non-interactive audio-visual presentations that run in real-time on a computer. The main goal of a demo is to show off programming, artistic, and musical skills. Source: Wikipedia.
  • 27. DOSBox is emulator software that emulates an IBM PC compatible computer running MS-DOS. It is intended especially for use with old PC games. DOSBox is a free software. Source: Wikipedia.
  • 28. QEMU is a processor emulator that relies on dynamic binary translation to achieve a reasonable speed while being easy to port on new host CPU architectures. In conjunction with CPU emulation, it also provides a set of device models, allowing it to run a variety of unmodified guest operating systems; it can thus be viewed as a hosted virtual machine monitor. It also provides an accelerated mode for supporting a mixture of binary translation (for kernel code) and native execution (for user code), in the same fashion as VMware Workstation and VirtualBox. QEMU can also be used purely for CPU emulation for user level processes, allowing applications compiled for one architecture to be run on another. Source: Wikipedia.
  • 29. VHDL (VHSIC hardware description language) is a hardware description language used in electronic design automation to describe digital and mixed-signal systems such as field-programmable gate arrays and integrated circuits. Source: Wikipedia.
  • 30. The Netart Router. A preservation tool for analyzing, documenting, archiving and displaying internet based art and dynamic web content. See: http://www.hkb.bfh.ch/de/forschung/forschungsschwerpunkte/fspmaterialita...
  • 31. Deep Packet Inspection (DPI) (also called complete packet inspection and Information eXtraction - IX -) is a form of computer network packet filtering that examines the data part (and possibly also the header) of a packet as it passes an inspection point, searching for protocol non-compliance, viruses, spam, intrusions or predefined criteria to decide if the packet can pass or if it needs to be routed to a different destination, or for the purpose of collecting statistical information. Source: Wikipedia.
  • 32. An application programming interface (API) is a source code based specification intended to be used as an interface by software components to communicate with each other. An API may include specifications for routines, data structures, object classes, and variables. Source: Wikipedia.
  • 33. IPv6 (Internet Protocol version 6) is a version of the Internet Protocol (IP) intended to succeed IPv4, which is the protocol currently used to direct almost all Internet traffic. Source: Wikipedia.
  • 34. Internet Protocol version 4 (IPv4) is the fourth revision in the development of the Internet Protocol (IP) and the first version of the protocol to be widely deployed. Together with IPv6, it is at the core of standards-based internetworking methods of the Internet. IPv4 is still by far the most widely deployed Internet Layer protocol (As of 2011, IPv6 deployment is still in its infancy). Source: Wikipedia.
  • 35. See: https://www.incca.org/articles/project-agatha-re-appears-net-art-restora...
  • 36. Digital rights management (DRM) is a class of access control technologies that are used by hardware manufacturers, publishers, copyright holders and individuals with the intent to limit the use of digital content and devices after sale. DRM is any technology that inhibits uses of digital content that are not desired or intended by the content provider. Copy protection that can be circumvented without modifying the file or device, such as serial numbers or keyfiles, is not generally considered to be DRM. DRM also includes specific instances of digital works or devices. Companies such as Amazon, AOL, Apple Inc., the BBC, Microsoft and Sony use digital rights management. In 1998 the Digital Millennium Copyright Act (DMCA) was passed in the United States to impose criminal penalties on those who make available technologies whose primary purpose and function is to circumvent content protection technologies. Source: Wikipedia.
  • 37. In IBM PC compatible computers, the basic input/output system (BIOS), also known as the System BIOS or ROM BIOS […] is built into the PC, and is the first code run by a PC when powered on ('boot firmware'). When the PC starts up, the first job for the BIOS is the power-on self-test, which initializes and identifies system devices such as the video display card, keyboard and mouse, hard disk drive, optical disc drive and other hardware. Source: Wikipedia.
  • 38. Read-only memory (ROM) is a class of storage medium used in computers and other electronic devices. Data stored in ROM cannot be modified, or can be modified only slowly or with difficulty, so it is mainly used to distribute firmware (software that is very closely tied to specific hardware, and unlikely to need frequent updates). Source: Wikipedia.
  • 39. ImageMagick is an open source software suite for displaying, converting, and editing raster image files. It can read and write over 100 image file formats. ImageMagick is licensed under the Apache 2.0 license. Source: Wikipedia.
  • 40. C is one of the most widely used programming languages of all time and there are very few computer architectures for which a C compiler does not exist. C has greatly influenced many other popular programming languages, most notably C++, which began as an extension to C. Source: Wikipedia.
  • 41. C++ is one of the most popular programming languages with application domains including systems software, application software, device drivers, embedded software, high-performance server and client applications, and entertainment software such as video games. Source: Wikipedia.
  • 42. Cloud computing is the delivery of computing as a service rather than a product, whereby shared resources, software, and information are provided to computers and other devices as a metered service over a network (typically the Internet). Cloud computing provides computation, software, data access, and storage resources without requiring cloud users to know the location and other details of the computing infrastructure. Source: Wikipedia.
  • 43. HTML5 is a language for structuring and presenting content for the World Wide Web, and is a core technology of the Internet originally proposed by Opera Software. It is the fifth revision of the HTML standard (created in 1990 and standardized as HTML4 as of 1997) and as of February 2012 is still under development. Its core aims have been to improve the language with support for the latest multimedia while keeping it easily readable by humans and consistently understood by computers and devices (web browsers, parsers, etc.). HTML5 is intended to subsume not only HTML 4, but XHTML 1 and DOM Level 2 HTML as well. Source: Wikipedia.