Digital and technological autonomy

Código e ideas para una internet distribuida

Linkoteca. Technological culture

Before implementing a captcha, it’s worth considering whether one is necessary to begin with. To evaluate this, consider whether your threat model is concerned with customized or uncustomized spam. Uncustomized spam is pervasive across many Internet protocols, and you will encounter it quickly after enabling HTTP, SSH, or many other protocols on a server. It is generally unintelligent, cheap to execute, and easy to block, even without captchas. Customized spam, however, is spam written specifically to target a given company, service, website, or user. Because customized spam is created by an actor able to tailor it to your service, it is more dangerous than uncustomized spam, and more effort is required to limit it effectively.

Just because someone could spend hours (or minutes) writing a program to spam your website does not mean that someone will. Your personal blog about the latest vegan bacon is not a high-priority target for anyone. Adding a ReCAPTCHA to your Contact Me page is just a great way to get no one to talk to you. I’ve run several websites with millions of pageviews that have received zero customized abuse, and I have spoken to other webmasters with similar experiences. Jeff Atwood once wrote something similar:

The comment form of my blog is protected by what I refer to as “naive captcha”, where the captcha term is the same every single time. This has to be the most ineffective captcha of all time, and yet it stops 99.9% of comment spam.

This is not a suggestion to do nothing, ignore basic security, and be unprepared for attacks, but rather to realistically consider your threat model and apply only what is necessary.
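Atwood’s “naive captcha” can be sketched in a few lines. This is only an illustration of the idea, not his actual implementation; the challenge question, field name, and expected answer below are hypothetical:

```python
# Minimal sketch of a "naive captcha": the challenge and expected
# answer are hard-coded and identical on every request. Uncustomized
# bots blindly POSTing forms never fill the field correctly, while a
# human reading the question passes trivially.

CHALLENGE = "What color is an orange? (one word)"
EXPECTED = "orange"

def is_probably_human(form_data: dict) -> bool:
    """Accept the submission only if the static captcha field matches."""
    answer = form_data.get("captcha", "").strip().lower()
    return answer == EXPECTED

# A generic spam bot leaves the field empty or stuffs it with links:
assert not is_probably_human({"comment": "buy pills", "captcha": "http://spam"})
# A human answering the question passes, whatever the capitalization:
assert is_probably_human({"comment": "nice post", "captcha": " Orange "})
```

Against customized spam this is worthless, of course; the point of the excerpt is that most sites never face customized spam.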

Apple users are seen as the ‘invisible poor’: those who do not look as poor as their financial circumstances suggest.

Apple iPhone users in China are generally less educated, harder up, and own fewer valuable assets than users of other mobile phone brands such as Huawei or Xiaomi, according to a report by the research agency MobData.

A flowchart for deciding whether a machine qualifies as AI.

In the broadest sense, AI refers to machines that can learn, reason, and act for themselves. They can make their own decisions when faced with new situations, in the same way that humans and animals can.

As it currently stands, the vast majority of the AI advancements and applications you hear about refer to a category of algorithms known as machine learning. These algorithms use statistics to find patterns in massive amounts of data. They then use those patterns to make predictions on things like what shows you might like on Netflix, what you’re saying when you speak to Alexa, or whether you have cancer based on your MRI.
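That “find patterns, then predict” loop can be caricatured in a few lines. The viewing data and the show name below are invented for illustration, and a one-nearest-neighbour lookup stands in for the vastly more elaborate statistics a real recommender uses:

```python
import math

# Hypothetical viewing histories: hours watched of
# (documentaries, sitcoms, thrillers), plus whether that viewer
# liked a made-up show called "Orbit".
history = [
    ((9.0, 1.0, 2.0), True),
    ((8.0, 0.5, 1.0), True),
    ((1.0, 7.0, 6.0), False),
    ((0.5, 9.0, 5.0), False),
]

def predict_likes(profile):
    """Predict by copying the label of the most similar past viewer
    (1-nearest-neighbour under Euclidean distance)."""
    _, label = min(history, key=lambda row: math.dist(row[0], profile))
    return label

# A documentary-heavy profile lands near the viewers who liked it:
assert predict_likes((7.0, 1.5, 2.0)) is True
# A sitcom-heavy profile lands near the viewers who did not:
assert predict_likes((1.0, 8.0, 4.0)) is False
```

The statistical machinery differs wildly across systems, but the shape is the same: past data in, a pattern of similarity found, a prediction out.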

This is digital residency (or e-residency), through which anyone can become a resident of this small country of 1.3 million inhabitants without needing to live there.

Digital residency does not confer citizenship, nor does it waive visa requirements. The goal is for the new residents to contribute to the country’s economy by using its banks and opening companies.

Digital residency, on the other hand, allows entrepreneurs to open a business without needing a local director in the country, as well as to access online banking and payment services, file taxes over the internet, or sign documents without setting foot in Estonia.

The initiative launched in 2014 and there are already more than 20,000 digital residents. In total, they run some 3,000 companies, according to official figures.

So instead of considering the practical ethics of impoverishing and exploiting the many in the name of the few, most academics, journalists, and science-fiction writers instead considered much more abstract and fanciful conundrums: Is it fair for a stock trader to use smart drugs? Should children get implants for foreign languages? Do we want autonomous vehicles to prioritize the lives of pedestrians over those of their passengers? Should the first Mars colonies be run as democracies? Does changing my DNA undermine my identity? Should robots have rights?

Asking these sorts of questions, while philosophically entertaining, is a poor substitute for wrestling with the real moral quandaries associated with unbridled technological development in the name of corporate capitalism. Digital platforms have turned an already exploitative and extractive marketplace (think Walmart) into an even more dehumanizing successor (think Amazon). Most of us became aware of these downsides in the form of automated jobs, the gig economy, and the demise of local retail.

But the more devastating impacts of pedal-to-the-metal digital capitalism fall on the environment and global poor. The manufacture of some of our computers and smartphones still uses networks of slave labor. These practices are so deeply entrenched that a company called Fairphone, founded from the ground up to make and market ethical phones, learned it was impossible. (The company’s founder now sadly refers to their products as “fairer” phones.)

Meanwhile, the mining of rare earth metals and disposal of our highly digital technologies destroys human habitats, replacing them with toxic waste dumps, which are then picked over by peasant children and their families, who sell usable materials back to the manufacturers.

Hirikilabs aims to be a space for reflecting on why, how, and for what purposes we use technology, and for experimenting around it; a space where technology is not a goal but a path, where it goes from being the center of the process to being an (important) part of the process.

…its objective is to analyze technology from multiple perspectives and to work critically on the relationships that arise from it, making use of various projects, working groups, and processes, and guaranteeing different ways of approaching it. We believe that, in this sense, we promote critical reflection on technology among citizens. Thus, we understand that a critical eye and mastery of the tools and of the dynamics of dissemination are the path to empowerment.

We understand sovereignty as a process by which citizens appropriate and come to understand technologies, contributing to a society that is better and more fully informed, with some capacity to protect itself against the challenges arising from an ever-greater coexistence with technology. The irruption into daily life of artificial intelligence, automation, online privacy, and surveillance are just some of the issues on which society will, sooner or later, have to take a stand. Open knowledge and the use of free software and hardware help people understand the implications and to know that, at the very least, alternatives exist.

Your big questions about the future answered. How science will influence and change our lives. Britt Wray and Ellie Cosgrave present a fortnightly investigation of a hot science topic in about 30 minutes. The Tomorrow’s World podcast will begin a second run of episodes in early 2018.

Data on waste generation typically separate producer wastes, such as those from mining, from consumer wastes, such as those from households. But there are problems with this division.

It makes the mistake of thinking producer waste and consumer waste are two separate things instead of flip sides of the same coin in industrial systems. It also makes the mistake of presuming consumers have much in the way of meaningful choice in what their electronics are made of.

Electronics contain a wide variety of materials. One important example is copper. The electronics industry is the second-largest consumer of copper. Only the building and construction sector uses more.

Post-consumer recycling of electronics will never be enough; if we are to slow our production of e-waste, we need to be able to repair — and upgrade — the devices we already have.

In the U.S., the Repair Association is doing the hard work of advocating for consumers’ right to repair the devices they purchase by enshrining those rights into law. Even so, an e-waste recycler in California now faces a 15-month prison sentence and a US$50,000 fine for his efforts to extend the lives of computers.

The automobile, food and pharmaceutical industries have to show their products meet certain safety standards before they are put on the market. Why not demand the same of the electronics industry?

As web companies strive to tailor their services (including news and search results) to our personal tastes, there’s a dangerous unintended consequence: We get trapped in a «filter bubble» and don’t get exposed to information that could challenge or broaden our worldview. Eli Pariser argues powerfully that this will ultimately prove to be bad for us and bad for democracy.

Since Dec. 4, 2009, Google has been personalized for everyone. So when I had two friends this spring Google «BP,» one of them got a set of links that was about investment opportunities in BP. The other one got information about the oil spill. Presumably that was based on the kinds of searches that they had done in the past. If you have Google doing that, and you have Yahoo doing that, and you have Facebook doing that, and you have all of the top sites on the Web customizing themselves to you, then your information environment starts to look very different from anyone else’s. And that’s what I’m calling the «filter bubble»: that personal ecosystem of information that’s been catered by these algorithms to who they think you are.

A filter bubble is the result of a personalized search in which a website’s algorithm selects, through predictions, the information the user would like to see, based on information about that user (such as location, search history, and elements clicked in the past); as a result, users are steered away from information that does not match their points of view, effectively isolating them in ideological and cultural bubbles of their own.

Examples include Google’s personalized search results and Facebook’s personalized news feed. The term was coined by the cyberactivist Eli Pariser in his book of the same name; according to Pariser, users are less exposed to conflicting points of view and are intellectually isolated in their own information bubble. Pariser relates an example in which one user searched Google for «BP» and got news about British Petroleum, while another person got information about the Deepwater Horizon oil spill, and the two sets of search results were strikingly different.
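Pariser’s BP example can be caricatured in a few lines of Python. The stories, topic labels, and click counts below are invented for illustration; real personalization systems are far more elaborate, but the shape is the same:

```python
# Toy sketch of a filter bubble: the same query ("BP") returns
# different stories to different users, ranked by how often each
# user previously clicked that story's topic. All data is made up.

stories = [
    ("BP quarterly results beat investor expectations", "finance"),
    ("BP oil spill cleanup continues in the Gulf", "environment"),
]

def personalized_results(click_history):
    """Rank stories by the user's past clicks on each topic."""
    return sorted(stories,
                  key=lambda story: click_history.get(story[1], 0),
                  reverse=True)

investor = {"finance": 40, "environment": 2}
activist = {"finance": 1, "environment": 30}

# Identical query, two different information environments:
assert personalized_results(investor)[0][1] == "finance"
assert personalized_results(activist)[0][1] == "environment"
```

Neither user is shown a falsehood; each is simply fed more of what the algorithm thinks they already are, which is precisely the bubble.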

According to Mark Zuckerberg: «Knowing that a squirrel is dying in your yard may be more relevant to your interests than knowing that people are dying in Africa.»

social metaphors tend to communicate more about the values of society than about the technology of the Internet itself

The electronic frontier metaphor conceptualizes the Internet as a vast unexplored territory, a source of new resources, and a place to forge new social and business connections.