Track 08, the annual student exhibition of the VCD and POV departments of İstanbul Bilgi University.

Data fragmentation



Another version of this print was also selected for the OFFF 2010 catalogue.

For the catalogue:

The Free Oscillation of Fragmented Public Data

The Internet has a structure that allows the interconnectedness of all nodes, peer-to-peer sharing and mesh networking models. Yet server-based networks, where digital data is accessed from a single centralized source, remain prevalent. Within server-based networks, digital data can be accessed only as long as the server continues to serve it. To ensure the permanent accessibility of public data, we have had to personalize it within our private space. But personalizing public data in this fashion prevents us from taking advantage of the possibilities offered by the nature of the Internet.

Some time ago, optical disks replaced the magnetic tapes that once constituted the only private data storage. Today we use hard disks for data storage, thanks to developments in their capacity and portability. Archiving methods, which include cataloguing, transferring and shelving data and mounting it when a request occurs, retain their classical functions in spite of the changes in the form of the storage medium.

As more private data is shared on public networks, the field of public data expands, as does the volume of data produced. At the same time, the increasing quality of digital data expands its size. As we personalize more data to ensure its permanence, managing and accessing our data archives becomes more problematic. As a result, the requirements of storing public data personally have grown to such an extent that they can no longer be met individually. So before we can allocate the storage space required to keep our data, it ages, and new data emerges. In addition to these factors, our inability to guarantee the security and physical durability of storage media continues to threaten the existence of digital data. Thus, it is pointless to struggle to store data whose permanence cannot be guaranteed in the first place.

Today, thanks to developments in bandwidth and increases in the number of networks and seeds, classical data storage is being replaced by a new habit: the Free Oscillation of Fragmented Public Data.

Data becomes transferable between interconnected clients that act as servers for one another within the cloud that makes up the Internet. Data requested by more clients gains a larger pathway within cyberspace. Instead of obtaining the entire data from a single source, it can be seeded in packages, and the nearest packages can be mounted via Open Shortest Path First (OSPF). This process shortens downloading time dramatically. Contrary to the workings of server-based networks, in this model a peer whose peer-to-peer connection breaks can continue harvesting data from other peers. Instead of storing public data, we can capture it from this cloud, where it constantly circulates at high speed. Accordingly, we obtain it from peers whenever we need it and serve it to other peers while we have it. Furthermore, this situation creates a natural balance in which the most wanted data is the most speedily available.
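As a minimal sketch of this harvesting idea, consider the Python fragment below. The peers, fragments, path costs and failure rate are all invented for illustration; the lowest path cost simply stands in for the OSPF-style notion of the nearest source, and a peer whose connection fails is skipped in favour of the next one.

```python
# A minimal sketch of harvesting fragmented data from the nearest peers.
# Peers, fragments and costs are invented for illustration only.
import random

# Each peer holds some fragments of the data and has a path cost to us.
peers = {
    "peer_a": {"cost": 3, "fragments": {0: "The ", 1: "free "}},
    "peer_b": {"cost": 1, "fragments": {1: "free ", 2: "oscillation"}},
    "peer_c": {"cost": 2, "fragments": {0: "The ", 2: "oscillation"}},
}

def harvest(fragment_ids):
    """Fetch each fragment from the cheapest peer that still serves it."""
    assembled = {}
    for frag_id in fragment_ids:
        # Peers that can serve this fragment, nearest (lowest cost) first.
        candidates = sorted(
            (p for p in peers.values() if frag_id in p["fragments"]),
            key=lambda p: p["cost"],
        )
        for peer in candidates:
            if random.random() < 0.2:   # a broken peer-to-peer connection
                continue                # simply move on to the next peer
            assembled[frag_id] = peer["fragments"][frag_id]
            break
    return "".join(assembled[i] for i in sorted(assembled))

print(harvest([0, 1, 2]))   # e.g. "The free oscillation"
```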

In this model, where circulation is thus free and dense, obtaining public data from many seeds instead of a single source does not cause electronic viruses to be transferred along with the requested data. For a virus to spread, it would have to reach all the peers simultaneously. When a peer is infected, the mutation the virus creates in the structure of the data that peer is seeding means the infected seed cannot match other peers, and it is taken out of circulation. A peer that holds virus-infected data does not become defunct; instead, the peer continues to serve the uninfected data it has and refreshes the infected data from healthy seeds. Unexpectedly, even if the data carries a virus during its first oscillation, its circulation rate and the number of peers it is distributed to do not increase, and thus it eventually dies out. So this model has a rare self-defence and self-healing capacity that we can consider organic.
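One way to picture how a mutated fragment falls out of circulation is a simple digest check, sketched below. SHA-256 digests stand in for whatever matching mechanism the model relies on; the fragment contents, the simulated infection and the healthy seed are invented for illustration.

```python
# A sketch of detecting a mutated fragment and refreshing it from a healthy seed.
import hashlib

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Digests published alongside the data when it was first seeded.
known_digests = {0: digest(b"The "), 1: digest(b"free "), 2: digest(b"oscillation")}

# What this peer currently holds; fragment 1 has been mutated by a virus.
local_fragments = {0: b"The ", 1: b"fr33 \x90\x90", 2: b"oscillation"}

healthy_seed = {0: b"The ", 1: b"free ", 2: b"oscillation"}  # another peer

for frag_id, data in local_fragments.items():
    if digest(data) != known_digests[frag_id]:
        # The infected fragment cannot match other peers: stop serving it
        # and refresh it from a healthy seed instead of going defunct.
        local_fragments[frag_id] = healthy_seed[frag_id]

print(all(digest(d) == known_digests[i] for i, d in local_fragments.items()))  # True
```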

Data with a decreasing circulation rate and a falling number of peers serving it is under the threat of dying out, like virus-infected data. But we can consider this a process of natural selection: it can be argued that the data that gets lost over time is no longer needed. As a result of the self-elimination implied by this model, junk data does not create noise within the cloud. In short, this model provides self-cleaning in addition to security. Nevertheless, private or public data warehouses can be established to store data that we want to keep and that is on the brink of extinction, to be accessed upon necessity. Thus, the requirements for data storage will be minimized.
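The self-elimination and warehousing described above can be sketched as a trivial bookkeeping rule. The catalogue entries and thresholds below are invented for illustration; a real system would measure circulation rates rather than hard-code them.

```python
# A sketch of natural selection in the cloud: dying data is dropped,
# and anything marked worth keeping is copied to a warehouse first.
catalogue = {
    "popular_film": {"seeds": 840, "requests_per_day": 1200, "keep": False},
    "old_homework": {"seeds": 1,   "requests_per_day": 0,    "keep": False},
    "rare_archive": {"seeds": 2,   "requests_per_day": 1,    "keep": True},
}
warehouse = {}

MIN_SEEDS, MIN_REQUESTS = 3, 5   # below both, the data is near extinction

for name, item in list(catalogue.items()):
    dying = item["seeds"] < MIN_SEEDS and item["requests_per_day"] < MIN_REQUESTS
    if dying and item["keep"]:
        warehouse[name] = item       # preserved, to be accessed upon necessity
    if dying:
        del catalogue[name]          # junk data leaves the cloud silently

print(sorted(catalogue), sorted(warehouse))
# ['popular_film'] ['rare_archive']
```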

In this model, where the existence of data depends on its circulation rate, it is not really possible to put data back into the loop once it runs out of circulation; neither is it easy to take data out of circulation once it is in the circuit. For this reason, private data that should be kept out of circulation should not contact the cloud at all. Otherwise, like software that spreads before its official release by its manufacturer, or homemade sex tapes of famous people that leak out involuntarily, data that gets in touch with the cloud risks living forever. To make matters worse, once data is digitized, it is almost impossible to ensure its total isolation from the cloud. In this situation, it is very difficult to keep data personal, and all such conservation efforts lose their meaning.

Yet we are sharing more and more of the information we considered private until recently, such as our real names, the people we are in relationships with, what we are doing at the moment, or our holiday pictures, on social networking platforms such as Facebook, YouTube and Flickr. This voluntary release of private data into oscillation shows that the field of public data is expanding as a result of the wishes of the actors involved, in addition to involuntary leaks of private data. Although classical archival methods continue to be useful for private data, the expansion of the field of public data makes the personal storage of public data meaningless, since any data we request is available within the circuit at high speed. The only thing we need to do is to obtain the data we need temporarily.

Mustafa Ercan Zırh
İstanbul Bilgi University,
Department of Visual
Communication Design.