Hello 👋 there! This is the second issue of NAND Circuit, a weekly newsletter that briefs you on the latest news and happenings in the tech world, emerging technologies, and the companies behind them. I hope you have as much fun reading this as I did writing it! If you like this newsletter, consider sending it to a friend who might find it interesting. Cheers 🥂!
In this issue,
- 🖼 A screenshot sells for $69 million
- 🤖 OpenAI’s state-of-the-art ML vision model’s weird predictions
- 🛳 Scientists want to send Noah’s ark to the Moon
- 👀 Lay’s Potato Chips releases a chrome plugin
- 🍕 Quick Bytes
A screenshot sells for $69 million 💵
A collage of 5,000 images, titled “Everydays — The first 5000 days” by the digital artist Beeple, sold at an online auction for a whopping $69.3 million, making it the third-highest price ever achieved by a living artist.
Okay, but anyone can just save this image right? Why would people buy it?
Well, it is for the same reason you can save a picture of the Mona Lisa on your phone and still never truly own it. This artwork was minted into what is called a Non-Fungible Token (NFT). Or, as I like to call it, Season 3 of the Blockchain Revolution.
What are NFTs?
NFTs are tokens that are used to represent ownership of unique items. This can include a variety of things like art, collectables or even real estate. Anything from the digital to the physical realm can be tokenised.
An NFT can only have one official owner at a time, and no one can modify the record of ownership or duplicate an NFT into existence. The creator, when tokenising the asset, can specify its scarcity: it can be a one-of-one or a limited-edition run of collectables. It is also possible for the creator to earn royalties every time the token is sold on. All of this is made possible by smart contracts on the Ethereum blockchain.
Theoretically, if I tokenise this newsletter, I can sell every edition as a new NFT. If you buy one, you get a unique set of characters that proves you own it, and no one can change that. You can sell it on to someone else, and each time it sells, if I enabled royalties, I get a cut. All of this is completely decentralised and independent of the platform you choose to sell your NFTs on. And that’s NFTs in a nutshell.
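The ownership-and-royalty mechanics above can be sketched in plain Python. This is a toy model with made-up names and numbers, not real Ethereum code; actual NFTs are ERC-721 smart contracts, and the 10% royalty here is just an illustrative figure.

```python
# Toy sketch of the logic an NFT smart contract encodes:
# one owner at a time, an append-only ownership record, and a
# royalty paid to the creator on every resale.

class ToyNFT:
    def __init__(self, creator, token_id, royalty_pct=10):
        self.creator = creator          # earns a cut on every resale
        self.token_id = token_id        # unique; can't be duplicated
        self.owner = creator            # exactly one owner at a time
        self.royalty_pct = royalty_pct
        self.history = [creator]        # record of every owner so far

    def sell(self, buyer, price):
        royalty = price * self.royalty_pct / 100
        seller_take = price - royalty
        print(f"{self.owner} -> {buyer}: seller keeps {seller_take}, "
              f"creator {self.creator} gets {royalty} in royalties")
        self.owner = buyer
        self.history.append(buyer)
        return royalty

nft = ToyNFT(creator="beeple", token_id=5000)
nft.sell("alice", 100)   # the creator is the first seller
nft.sell("bob", 1000)    # beeple still earns a 10% royalty
```

On-chain, the same guarantees come from the contract's code rather than trust in any marketplace, which is why the record can't be quietly edited.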
NFTs and Ethereum are solving real problems that exist on the internet today. As everything becomes more digital, there is no reliable place on the internet where artists or creators can immutably link to their digital creations. We need to protect originality. And by doing so, we can empower it.
Is the NFT craze a revolution or bubble? Only time will tell. But this technology is a much welcomed solution to the authentication and provenance problem faced by the art world. So what asset are you tokenising?
OpenAI’s State-of-the-art ML vision Model can be fooled with a pen 🖊 and paper 📄
OpenAI’s state-of-the-art Machine Learning (ML) model should convince you that the dystopian AI envisioned by Terminator isn’t coming any time soon. OpenAI announced CLIP in January this year. It is a neural network that efficiently learns visual concepts from natural language supervision. This means that every image it learns from has a text attribute describing what it is. OpenAI used around 400 million image-text pairs scraped from the internet to train CLIP.
During its testing, researchers at OpenAI discovered “multimodal neurons” — individual components in the machine learning network that respond not only to images of objects but also sketches, cartoons, and associated text which can be presented literally, symbolically, or conceptually. See the image below.
This is groundbreaking because this is how the human brain responds to stimuli. It is also the reason you’re asked to take “Prove you’re human” captchas: machines generally can’t infer from random data without prior context.
But earlier this month, OpenAI reported on some shortcomings of CLIP. It turns out that, because of its unusual architecture, the model can be tricked into producing erroneous predictions. By exploiting CLIP’s ability to read text robustly, researchers found that even photographs of hand-written text can often fool the model.
The same ability that allows the program to link words and images at an abstract level creates this very weakness. OpenAI calls it the “fallacy of abstraction”.
CLIP’s multimodal neurons have been found to generalise across the literal and the iconic meaning of an image. For example, a cute picture of a poodle can be erroneously predicted as a piggy bank (doggy bank 🐶) just by overlaying ‘$$$’ on it. The dollar signs activate the finance neuron, which usually responds to piggy banks and the ‘$’ sign, and that abstraction tricks a state-of-the-art model.
OpenAI also reported on the many unchecked biases and associations that CLIP inherited (well, it did source its data from the internet). They found a “Middle East” neuron with an association with terrorism, an “immigration” neuron that responds to Latin America, and a neuron that fires for both dark-skinned people and gorillas. These associations are harmful and unacceptable, whether made by everyday people or machines. We must have systems in place to check biases in AI, as well as in our own conscience.
While a child would’ve distinguished between an iPod and a sticker with the text ‘iPod’, generalised ML models just can’t do that yet. But considering how far we have come, it might just be around the corner.
Scientists want to send Noah’s ark to the Moon
Scientists are pulling a ‘Noah's Ark’ in a new lunar proposal that they call a global insurance policy in preparation for the likelihood of ‘The Apocalypse’.
There already exists a doomsday vault housing plant seeds: the Svalbard Global Seed Vault in Norway. But this one is not just for flora but for fauna too.
Well, unlike the ark’s two of every animal, the solar-powered moon ark would cryogenically store frozen seed, spore, sperm and egg samples from some 6.7 million Earth species, including humans: 50 samples from each of the 6.7 million species, which totals around 335 million.
The proposed bank, or ‘Ark’, would sit beneath the moon’s surface, which would help shield it from solar radiation, meteors and temperature swings on the surface. Transporting 335 million samples to the moon would take about 250 rocket launches 🚀 (it took 40 launches to build the International Space Station).
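A quick sanity check of those numbers. The totals match the proposal; the per-launch figure is my own back-of-the-envelope division, not something the scientists stated.

```python
# Back-of-the-envelope arithmetic for the lunar ark proposal.
species = 6_700_000
samples_per_species = 50
total_samples = species * samples_per_species
print(f"{total_samples:,} samples")        # 335,000,000 samples

launches = 250
samples_per_launch = total_samples // launches
print(f"{samples_per_launch:,} per launch")  # 1,340,000 per launch
```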
Conversations like this have a weird way of making us feel like we’re a part of a distant future. And I, for one, am here for it. To the moon 🚀!
Lay’s new chrome plugin will turn on subtitles when you start eating chips
Not that anyone asked for it, but Lay’s launched a new plugin that’ll turn on subtitles the exact moment you start eating chips.
Lay’s, in this spoof, claims: “When you’re crunching and munching, it can be hard to hear the audio. And Lays is the crispiest chip of all.”
Lay’s apparently trained an AI on 178 hours of crisp crunch sounds gathered from all over the world. It uses “crisp-sound-recognition” to detect exactly when you bite into a chip and automatically turns the subtitles on. This, of course, ensures users don’t miss any part of the video while eating chips.
Well, Lay’s, how about next time you create a plugin to determine how much air is in each Lay’s chip bag? (Hint: about 40% 🙃).
🍕 Quick Bytes
- Netflix is reportedly testing a feature to limit password-sharing by asking viewers if they “live with the account holder” amid heavy competition in the “streaming wars”.
- Researchers in Helsinki have developed a new GAN that identifies and produces faces you find attractive. (What does this mean for the future of online dating?)
- Google faces lawsuit for tracking users in incognito mode.
- Tinder will soon let you run a background check on a potential date.
- SpaceX’s Falcon 9 rocket launches for a record ninth time, bringing 60 more Starlink satellites into orbit.
That’s all from me this week! If you liked this issue, forward it to your friends who might find it interesting. Bye bye 👋