Digital and technological autonomy

Code and ideas for a distributed internet

Linkoteca. Digital infrastructure


Many movements throughout history have looked to an imagined past, and indeed actively constructed an idea of the past, in order to envision a better future. And often there’s a lot of political utility in making people feel as if they’ve lost something—a set of rights, a set of freedoms—that they now need to reclaim, even if it’s not entirely clear whether those rights or freedoms ever existed.

It’s also important to point out that internet nostalgia is a constant of internet history.

What if we could feel nostalgic not for those previous eras of the internet that the onward march of privatization has obliterated—whether GeoCities or Myspace or even farther back—but for the missed opportunities, for the forks in the road that could have gone a different way, for the points in history at which privatization was deepened when the internet could have evolved along a different channel? Then perhaps nostalgia could be an aid to the social movements that will be necessary to deprivatize and democratize the internet.

We use the internet in the privacy of our bedrooms, or in the glow of our smartphones. What if our experience of the internet could be a more collective one, and one that brought us into relationships of solidarity and mutual support with other people in our community? So to that end, I think what the Equitable Internet Initiative is doing could provide a promising starting point for thinking about connecting differently through the internet.

In this essay a new form of Internet activism is proposed: stacktivism. Building on hacktivist practices, this form of code and standard development as political struggle is envisioned to connect the different layers of the techno-protocological stack (also known as the Internet), in order to build bridges between different, still isolated institutional levels and disciplinary practices: grassroots wifi-access initiatives, interface designers, geeks, computer scientists, and governance experts. How do we envision a public stack, one that goes beyond structures such as ICANN, the IETF, and the IGF, that can take up the task of rebuilding the Internet as a decentralized, federated, public infrastructure?

Submarine cables in the Canary Islands, 2015

The submarine cables in the Canary Islands that connect the islands to one another are the following. The year of entry into service is given in parentheses:

  • Telefónica’s inter-island cables: except in the case of El Hierro, their ring configuration prevents an island from being cut off by a single cable break.
    • Transcan 2 – S1: Gran Canaria – Fuerteventura (1990)
    • Transcan 2 – S2: Fuerteventura – Lanzarote (1990)
    • Pencan 5 – S2 (TFE-GC): Tenerife – Gran Canaria (1992)
    • Tegopa – S1: Tenerife – La Gomera (1995)
    • Tegopa – S2: La Gomera – La Palma (1995)
    • Candalta 1: Tenerife – Gran Canaria (1999)
    • Transcan 3: Gran Canaria – Lanzarote (1999)
    • Telapa: Tenerife – La Palma (2004)
    • Gomera-Hierro: La Gomera – El Hierro (2007)
    • Candalta 2: Tenerife – Gran Canaria (2010)
  • Cable Submarino de Canarias: operates two submarine cables between Tenerife and Gran Canaria (2002).
  • Canalink: operates two submarine cables between Tenerife and Gran Canaria and one between Tenerife and La Palma (2011).

In addition, the islands are connected to the outside world by the following systems:

  • Telefónica operates three submarine cables: two connect Cádiz with Tenerife (PENCAN 6 and PENCAN 8) and the third (PENCAN 7) connects Cádiz with Gran Canaria. The last two were upgraded in 2015 with 100G technology.
  • Canalink operates a submarine cable system connecting Tenerife with Cádiz; this cable has a branch connecting to Morocco.
  • The island of Tenerife is connected to the ACE (Africa Coast to Europe) system, a consortium in which Orange participates.
  • The island of Gran Canaria is connected to the WACS (West African Cable System), a consortium in which Vodafone participates.

The telecom operator locates the damaged area by zeroing in on the problematic section. To do this, they send signal pulses through the cable from one end, or base station. The damaged area (the break) bounces the pulse back to the site that sent it. By calculating the time delay of the reflected signal, engineers can pinpoint the exact location of the fault. A large cable-repair ship is then sent out with fresh optical cable to replace the defective section under the sea. The cable is lifted from the seabed using special hooks (grapnels) and hauled onto the ship. The faulty section is spliced on board: joined to fresh cable, sealed with a watertight, anticorrosive covering, and returned to the seabed. The ship then notifies the base stations to test the cable again. Once the test signals reach their destination, the work is confirmed and the repair declared successful. Traffic is then switched back on and the connection restored. The entire process can take sixteen hours or more, depending on the number of cable breaks, the success of the repairs, the time of day, weather conditions at sea, and shipping around the area.
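The fault-location step described above is simple round-trip arithmetic, the principle behind optical time-domain reflectometry. A minimal sketch in Python (the refractive-index value is an assumption, a typical figure for fiber cores; light travels through fiber at roughly c divided by that index, and the one-way distance is half the round trip):

```python
# Estimate the distance to a cable break from the round-trip
# delay of a reflected test pulse sent from a base station.

C_VACUUM = 299_792_458      # speed of light in vacuum, m/s
REFRACTIVE_INDEX = 1.47     # assumed typical value for optical fiber

def distance_to_fault_km(round_trip_delay_s: float) -> float:
    """One-way distance from the base station to the break, in km.

    The pulse travels to the fault and back, so the one-way
    distance is half the round-trip distance.
    """
    speed_in_fiber = C_VACUUM / REFRACTIVE_INDEX   # ~2.04e8 m/s
    return (speed_in_fiber * round_trip_delay_s) / 2 / 1000

# A reflection arriving 1 millisecond after the pulse was sent
# places the break roughly 100 km out.
print(f"{distance_to_fault_km(1e-3):.1f} km")
```

With a one-millisecond echo, the break sits roughly 100 km from the base station, which is why a single measurement from shore is enough to tell the repair ship where to sail.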

Climate disaster could take away both their connection and a crucial source of income. And, ironically enough, these cables are used to help gather data on climate impacts, so climate change could mess up the very tools we need to monitor the impact of climate change.

Screenshot of OpenArchive

Now, more than ever, capturing media on mobile phones plays a key role in exposing global injustice.

OpenArchive helps individuals around the world to securely store and share the critical evidence they’ve captured.

Save
Share · Archive · Verify · Encrypt

Save is a new mobile media app designed by OpenArchive to help citizen reporters and eyewitnesses around the world preserve, protect, and amplify what they’ve documented.

My goal is neither to eliminate the powerful internet platforms nor to cede the future to them – it is to imagine possible futures where surveillant advertising delivered by monopoly providers isn’t the only available option to build a thriving future of democratic communications.

From the moment radio became practically possible to the moment it became a powerful cultural force is roughly fifteen years, from 1912 to 1927. What occurred in those fifteen years was a gold rush that resembles the late 1990s internet boom in its passion and energy but differs sharply in the diversity of models pursued.

The BBC had several enormous advantages over its U.S. counterparts. Not only did it have an enviable monopoly, it had a guaranteed revenue stream from the annual license fees levied on each radio receiver sold.

Again, the lesson is that a particular business model is not inevitable but the product of political, economic and cultural forces.

One country where Google and Facebook have very little power and influence is China, where government censorship, designed to control online expression, had the interesting side effect of protecting China’s domestic internet market from foreign competitors. While many Chinese dissidents, journalists, and fans of western movies became skilled at “jumping the Great Firewall,” China’s domestic market and linguistic isolation were significant enough to enable a rich and complex local internet ecosystem.

While the server software that operates much of the World Wide Web is open-source software, as is the Firefox web browser, the online content and services business is dominated by the U.S. and Chinese models, with Wikipedia as the sole noncommercial site in the worldwide top 100 sites.

Wikimedia is a form of public service media, though it resembles the role of public radio in the United States, which is supported by a mix of listener donations and commercial sponsorship rather than a license fee, as in the BBC model. Its decisions are driven by a set of articulated and well-debated values about access to knowledge and information and not by market signals. That it is able to survive without government support or a license fee is not an argument against public support for media – instead, it’s an open invitation to ask what other services we could build if we innovated outside the logic of markets more often.

On Thursday we reported that with its Tobias McFadden ruling, handed down in a case of MP3 file piracy, the Court of Justice of the European Union (CJEU) effectively sentenced open Wi-Fi networks to death, by requiring professionals who offer such a service to collect the identity of the users who connect to it and to keep a log of their connections. Those who fail to do so will face financial consequences, even though the Court holds that they are not liable for the illegal downloads carried out over their connection.

The Court had been asked by the German courts about the operator of a sound-and-lighting shop who offered free Wi-Fi access, open to all his customers, without securing it against downloading over peer-to-peer (P2P) networks. Sony Music asked that the shopkeeper be held civilly liable for the illegal MP3 downloads carried out by third parties over that connection, and that he be required to secure the Wi-Fi network.

Yet in the twenty-first century, power will be determined not by one’s nuclear arsenal, but by a wider spectrum of technological capabilities based on digitization. Those who aren’t at the forefront of artificial intelligence (AI) and Big Data will inexorably become dependent on, and ultimately controlled by, other powers. Data and technological sovereignty, not nuclear warheads, will determine the global distribution of power and wealth in this century. And in open societies, the same factors will also decide the future of democracy.

The most important issue facing the new European Commission, then, is Europe’s lack of digital sovereignty. Europe’s command of AI, Big Data, and related technologies will determine its overall competitiveness in the twenty-first century. But Europeans must decide who will own the data needed to achieve digital sovereignty, and what conditions should govern its collection and use.

In recent years we have watched the techno-optimist myths fall one by one. The internet does not regulate itself, it is not neutral, it is not reliable. It probably isn’t even good for our brains. For me, the myth that thanks to it you can live wherever you want, that it will help close the urban-rural divide, falls too. Not only is it not taking pressure off the cities; it is finishing off rural areas already starved of infrastructure. It is one more of the thousand things that are missing. It was supposed to help populate the countryside, but its absence helps depopulate it.

While enormous areas of Spain stand depopulated, everything keeps pushing us toward the cities, swelling even further the outskirts our ancestors already filled. While we allow politicians and media to talk only about controversies that matter only in Madrid and Barcelona, nothing is guaranteed in empty Spain. The only certainty is that every year there are fewer inhabitants, and that the services that are lost are never recovered. Perhaps by the time the internet finally arrives, no one will be left there to use it.