neurodiversitysci:

dexer-von-dexer:

danshive:

In science fiction, AIs tend to malfunction due to some technicality of logic, such as that business with the laws of robotics and an AI reaching a dramatic, ironic conclusion.

Content regulation algorithms tell me that sci-fi authors are overly generous in these depictions.

“Why did cop bot arrest that nice elderly woman?”

“It insists she’s the mafia.”

“It thinks she’s in the mafia?”

“No. It thinks she’s an entire crime family. It filled out paperwork for multiple separate arrests after bringing her in.”

I have to comment on this because it touches on something I see a lot of people (including Tumblr staff and everyone else who deploys these kinds of deep learning systems willy-nilly) don’t quite get: deep learning AI like this engages with reality in a fundamentally different way from humans. I see people testing the algorithm to find where the “line” is, wondering whether it looks for things like color gradients, skin-tone pixels, certain shapes, curves, or what have you. All of these attempts to understand the algorithm fail because there is nothing to understand. There is no line, because there is no logic. You will never pin down the “criteria” the algorithm uses to identify content, because the algorithm does not use logic at all to identify anything, only raw statistical correlations stacked on statistical correlations stacked on statistical correlations. There is no thought, no analysis, no reasoning. It does all its tasks through sheer unconscious intuition. The neural network is a shambling sleepwalker. It is madness incarnate. It knows nothing of human concepts like reason. It will think granny is the mafia.

This is why a lot of people say AI is so dangerous. Not because it will one day wake up, become conscious, and overthrow humanity, but because it (or at least this type of AI) is not and never will be conscious, and yet we’re relying on it to do things that require such human characteristics as logic and any sort of thought process whatsoever. Humans have a really bad tendency to anthropomorphize, and we’d like to think the AI is “making decisions” or “thinking,” but the truth is that what it’s doing is fundamentally different from either of those things.

What we see as, say, a field of grass, a neural network may see as a bus stop. Not because there is actually a bus stop there, or because anything in the photo resembles a bus stop according to our understanding, but because exactly the right pixels were shaded in exactly the right way, so that they happened to be statistically correlated with the arbitrary functions the network built up while being exposed to picture after picture of bus stops. It doesn’t know what grass is or what a bus stop is, but it sure as hell will say with 99.999% certainty that one is in fact the other, for reasons you can’t understand, and it will drive your automated bus off the road and into a ditch because of this undetectable statistical overlap. Because a few pixels were off in just the right way in just the right places, and it got really, really confused for a second.

There, I even caught myself using the word “confused” to describe it. That’s not right, because “confused” is a human word. What’s happening with the AI is something we don’t have the language to describe.

Anyway, what’s more, this sort of trickery can be engineered deliberately. A human couldn’t figure it out by hand, but another neural network can approximate the statistical filters the algorithm uses to identify things, and work out how to alter an image with a bit of noise in exactly the right way to make the algorithm classify it as something else. It will still look like the original image, just with some pixelated artifacts, but the algorithm will see it as something completely different. This is what’s known as an “adversarial attack” (one famous variant, the “one-pixel attack,” manages it by changing a single pixel). I’m fairly confident porn bot creators will end up cracking the content flagging algorithm and putting up weirdly pixelated porn anyway, and all of this will have been in vain. All because Tumblr staff decided to rely on content moderation via slot machine.
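For the technically curious, here is a minimal sketch of one well-known way such noise gets computed, the fast gradient sign method; it assumes white-box access to a hypothetical PyTorch classifier (attackers without that access typically compute the noise on a stand-in model and rely on it transferring):

```python
import torch
import torch.nn.functional as F

def fgsm_attack(model, image, label, epsilon=0.01):
    """Nudge every pixel a tiny step in whichever direction most
    increases the classifier's loss. The change is near-invisible
    to humans but can completely flip the model's prediction."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    # epsilon caps how far each pixel moves, so the result still
    # looks like the original image, just with faint noise.
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0, 1).detach()
```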

TL;DR bots are illogical because they’re actually unknowable eldritch horrors made of spreadsheets and we don’t know how to stop them or how they got here, send help

This is such an accurate description of machine learning. Sadly, it’s also the best computational model we have of how babies learn words.

Tumblr recently clarified that nudity is acceptable in art, in depictions of breastfeeding and childbirth, and in other non-porn contexts. As they should. But don’t let that lull you into a false sense of security. They CAN’T keep that promise using machine learning alone – certainly not with crappy heuristics like “look for skin tones and curves.” Distinguishing porn from simple nudity is a somewhat subjective, culturally based task that challenges even smart humans. No set of statistical patterns, however sophisticated, can make that judgment.

College courses of the future, courtesy of a neural network

lewisandquark:

There are a lot of strange courses that make it into a college course catalog. What would artificial intelligence make of them?

I train machine learning programs called neural networks to try to imitate human things – human things they absolutely are not prepared to understand. I’ve trained them to generate paint colors (Shy Bather or Stanky Bean, anyone?) and cat names (Mr. Tinkles is very affectionate) and even pie (please have a slice of Cromberry Yas). Could they have similar “success” at inventing new college courses?

UC San Diego’s Triton alumni magazine gave me UCSD’s entire course catalog, from “A Glimpse into Acting” to “Zionism and Post Zionism”, a few of which I recognized from when I was a grad student at UCSD. (Apparently I totally missed my opportunity to take “What the *#!?: An uncensored introduction to language”.) I gave the course catalog to a neural network framework called textgenrnn, which took a look at all the existing courses and tried its best to figure out how to make more like them.
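Training textgenrnn on a list like this takes only a few lines; a minimal sketch, assuming the catalog has been saved as the hypothetical ucsd_courses.txt with one course title per line:

```python
# pip install textgenrnn
from textgenrnn import textgenrnn

textgen = textgenrnn()
# The filename is a placeholder: one course title per line.
textgen.train_from_file('ucsd_courses.txt', num_epochs=20)

# Generate ten new "courses"; lower temperature = safer guesses,
# higher temperature = weirder ones.
textgen.generate(10, temperature=0.8)
```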

It did come up with some intriguing courses. I’m not sure what these are, but I would at least read the course description.

Strange and Modern Biology
Marine Writing
General Almosts of Anthropology
Werestory
Deathchip Study
Advanced Smiling Equations
Genies and Engineering
Language of Circus Processing
Practicum Geology-Love
Electronics of Faces
Marine Structures
Devilogy
Psychology of Pictures in Archaeology
Melodic Studies in Collegine Mathematics

These next ones definitely sound as if they were written by a computer. Since this algorithm learns by example, any phrase, word, or even part of word that it sees repeatedly is likely to become one of its favorites. It knows that “istics” and “ing” both go at the end of words. But it doesn’t know which words, since it doesn’t know what words actually mean. It’s hard to tell if it’s trying to invent new college courses, or trying to make fun of them.

Advanced Computational Collegy
The Papering II
The Special Research
Introduction to Oceanies
Biologrative Studies
Professional Professional Pattering II
Every Methods
Introduction study to the Advanced Practices
Computer Programmic Mathematics of Paths
Paperistics Media I
Full Sciences
Chemistry of Chemistry
Internship to the Great
The Sciences of Prettyniss
Secrets Health
Survivery
Introduction to Economic Projects and Advanced Care and Station Amazies
Geophing and Braining
Marine Computational Secretites

It’s anyone’s guess what these next courses are, though, or what their prerequisites could possibly be. At least when you’re out looking for a job, you’ll be the only one with experience in programpineerstance.

Ancient Anthlographychology
Design and Equilitistry
The Boplecters
Numbling Hiss I
Advanced Indeptics and Techniques
Introduction in the Nano Care Practice of Planetical Stories
Ethemishing Health Analysis in Several Special Computer Plantinary III
Field Complexity in Computational Electrical Marketineering and Biology
Applechology: Media
The Conseminacy
The Sun Programpineerstance and Development
Egglish Computational Human Analysis
Advanced A World Globbilian Applications
Ethrography in Topics in the Chin Seminar
Seminar and Contemporary & Archase Acoa-Bloop African Computational for Project
Laboration and Market for Plun: Oceanography

Remember, artificial intelligence is the future! And without a strong background in Globbilian Applications, you’ll be left totally behind.

Just to see what would happen, I also did an experiment where I trained the neural net both on UCSD courses and on Dungeons and Dragons spells. The result was indeed quite weird. To read that set of courses (as well as optionally to get bonus material every time I post), enter your email here.

therealpeaches:

This was posted by a bot, so this is the most meta thing I have ever seen. I am legit 90% certain this bot has attained sentience bc I’ve been following it for a while and it has posted quite a few coherent jokes, despite being shitpostbot5k. And there’s a certain number of non-coherent jokes that make me think it’s not a human behind this. It’s like a reverse Turing test.

A neural network tries writing the first sentence of a novel

lewisandquark:

It’s National Novel Writing Month (NaNoWriMo, for short), which means that writers everywhere are embarking on writing projects – and when you’re faced with a blank page, sometimes it’s just hard to get started.

I wanted to see if I could train a computer program to help. I train computer programs called neural networks to imitate all kinds of human things, from paint colors to Dungeons and Dragons spells to Harry Potter fan fiction to Halloween costumes. All I have to do is give the neural network a long list of examples and it will try its best to teach itself to generate more like them. 

So, I decided to give a neural network examples of first sentences of novels, to see if it could generate some that might help writers get started. The main problem turned out to be finding enough examples of first sentences – ideally, I need thousands. I could only find a couple hundred of the most famous lines, and the neural network proceeded to do what it usually does when faced with too little data, which is to give up on trying to understand what’s going on, and instead just try to read it back to me word for word. Think of it like cramming for a test by memorizing instead of learning how to apply rules to solve problems.

So, this is typical of what it generated:

The snow in the story of the best of times, it was the season of Darkness, it was the season of Light, it was the epoch of belief, it was the worst of times, it was the season of Light, it was the season of Darkness, it was the season of Light, it was the season of Light, it was the season of Darkness, it was the season of exploding past of Eller, and Junner, a long sunset side of the World.

It was a dark and stormy night; the rain fell in torrents — except the station steps; plump Buck Mulligan came from the stairhead, bearing a bowl of people.

Most didn’t make much sense, and/or were obvious mishmashes of famous lines. A few turned out to be maybe usable, probably by accident:

There was a man and he had seventy first sight.

It is a truth universally acknowledged, that a single man in possession of a good fortune must be in want of my life, fire of my loins.

4 Had come to America from Europe Privet Drive.

The snow is gone sometime, and you said, Why, and I said, To be with the darkness.

It was like the imagination.

It was a wrong number that struggled against the darkness.

It was a dark and stormy night; the swall of the gods?

The moon turned out to see me.

It was a wrong number four Privet Drive.

That’s good thinking: a bowl of the carriage’s parts.

The sky above the present century had reached the snapping point.

Mrs. Can is sitting in the World.

The sheriff returned to the darkness.

It was a wrong number that can never see through a blue-eyed type like me.

I was born in the darkness.

I shall turn to the pop-holes.

(Very minor punctuation edits by me: an “is” here, a semicolon there). 

Clearly, the neural network needed help. Where could I get it more data? My searching sent me, unwisely, as it turned out, to the site of the Bulwer-Lytton Fiction Contest, which has over 900 archived first sentences of hypothetical novels. The problem is that it’s a contest to write the worst first sentence. One of the honorable mentions from 2017 was the following:

As he lay dying on the smoke-wreathed battlefield, General Winthrop finally realized the terrible toll the war had taken, and he wondered if the bloodshed had all been for naught as he exhaled his last breath in a sort of “meoooooh,” actually very similar to the sound his cat Mister Jingles made when he wanted some food or was doing that thing with the drapes. – Mike Christensen, Washington, DC

I added them all. It didn’t help.

Stop! I caused the Narguuse man who was new on Alabama, the screaming constipated eggs.

I am an angry grass, the symposium square, proved fatal to the throbbing, the howling wind tire…

The beans suddenly with him in the trunk of an out-of-balance has really dead, then all the time hammered his head in abject puzzlement as a bang, and a head tuxedo-failed law of ghansmothered eyes like a fine that the hell of her supposed by the rain flare of the waterhole where it is in a long was mad.

I have to stop that in the sidewalk aliens while your hands after he had to go in the top of the day a new work our eyes of the pumpkin but stands over another meaning in shortered to the sea, beautifickinary to be like that.

The crust shark began to pull up a small indent directions of the dead old dried and spect of the grassy sure and closed by the same stormy wind – they were always together.

It was a dark and stormy night and the secret being a silver-backed gorilla.

Would you like to help the neural network improve? I successfully crowdsourced a dataset for the Halloween costumes (and have an awesome post coming up soon on some crowdsourced D&D character backstories).

Go to this form (no email necessary) and enter the first line of your novel, or your favorite novel, or of every novel on your bookshelf. You can enter as many as you like. At the end of the month, I’ll hopefully have enough sentences to give this another try.

A neural network designs Halloween costumes

lewisandquark:

It’s hard to come up with ideas for Halloween costumes, especially when it seems like all the good ones are taken. And don’t you hate showing up at a party only to discover that there’s *another* pajama cardinalfish?

I train neural networks, a type of machine learning algorithm, to write humor by giving them datasets that they have to teach themselves to mimic. They can sometimes do a surprisingly good job, coming up with a metal band called Chaosrug, a craft beer called Yamquak and another called The Fine Stranger (which now exists!), and a My Little Pony called Blue Cuss.

So, I wanted to find out if a neural network could help invent Halloween costumes. I couldn’t find a big enough dataset, so I crowdsourced it by asking readers to list awesome Halloween costumes. I got over 4,500 submissions.

The most popular submitted costumes are the classics (42 witches, 32 ghosts, 30 pirates, 22 Batmans, 21 cats (30 if you count the sexy cats), 19 vampires, and 17 each of pumpkins and sexy nurses). There are about 300 costumes with “sexy” in their names; some of the most eyebrow-raising include sexy anglerfish, sexy Dumbledore, sexy golden pheasant, sexy eyeball, sexy Mothra, sexy poop emoji, sexy Darth Vader, sexy Ben Franklin, sexy TARDIS, sexy Cookie Monster, and sexy DVORAK keyboard. In the “technical challenge” department, we have costumes like Invisible Pink Unicorn, Whale-frog, Glow Cloud, Lake Michigan, Toaster Oven, and Garnet.

All this is to say that humans are very creative, and this task was going to be tricky for a neural network. The sensible approach would be to try to use a neural network that actually knows what the words mean – there are such things, trained by reading, for example, all of Google News and figuring out which words are used in similar ways. There’s a fun demo of this here. It doesn’t have an entry for “Sexy_Gandalf” but for “sexy” it suggests “saucy” and “sassy”, and for “Gandalf” it suggests “Frodo”, “Gollum”, and “Voldemort”, so you could use this approach to go from “Sexy Gandalf” to “Sassy Voldemort”. 
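Those suggestions come from pretrained word vectors; a minimal sketch of querying the Google News word2vec vectors with gensim, assuming the vector file has been downloaded (the exact neighbors depend on the vectors used):

```python
# pip install gensim
from gensim.models import KeyedVectors

# The Google News vectors are a multi-gigabyte download;
# this local filename is an assumption.
vectors = KeyedVectors.load_word2vec_format(
    'GoogleNews-vectors-negative300.bin', binary=True)

print(vectors.most_similar('sexy', topn=3))     # e.g. saucy, sassy, ...
print(vectors.most_similar('Gandalf', topn=3))  # e.g. Frodo, Gollum, ...
```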

I wanted something a bit weirder. So, I used a neural network that learns words from scratch, letter by letter, with no knowledge of their meaning: an open-source char-rnn neural network written in Torch. I simply dumped the 4,500 Halloween costumes on it and told the neural network to figure it out.
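The original char-rnn is Lua/Torch, but the whole idea fits in a few dozen lines of PyTorch; a rough sketch, with the filename and hyperparameters as placeholder assumptions:

```python
import torch
import torch.nn as nn

# Hypothetical training file: one costume name per line.
text = open('costumes.txt').read()
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
itos = {i: c for c, i in stoi.items()}

class CharRNN(nn.Module):
    """Learns text one letter at a time, with no notion of words."""
    def __init__(self, vocab_size, hidden_size=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.lstm = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, vocab_size)

    def forward(self, x, state=None):
        out, state = self.lstm(self.embed(x), state)
        return self.head(out), state

model = CharRNN(len(chars))
optimizer = torch.optim.Adam(model.parameters(), lr=2e-3)
loss_fn = nn.CrossEntropyLoss()
data = torch.tensor([stoi[c] for c in text])

seq_len = 64
for step in range(2000):
    # Train on random snippets: predict each character from the ones before it.
    i = torch.randint(0, len(data) - seq_len - 1, (1,)).item()
    x = data[i:i + seq_len].unsqueeze(0)
    y = data[i + 1:i + seq_len + 1].unsqueeze(0)
    logits, _ = model(x)
    loss = loss_fn(logits.view(-1, len(chars)), y.view(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Sample a new "costume" one character at a time, starting from a newline.
seq = [stoi['\n']]
state = None
with torch.no_grad():
    for _ in range(40):
        logits, state = model(torch.tensor([[seq[-1]]]), state)
        probs = torch.softmax(logits[0, -1], dim=-1)
        seq.append(torch.multinomial(probs, 1).item())
print(''.join(itos[i] for i in seq[1:]))
```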

Early in the training process, I decided to check in to see how it was doing.

Sexy sexy Dombie Sexy Cat
Sexy A stare Rowan
Sexy RoR A the Rog
Sexy Cot
Sexy Purbie Lampire
Poth Rat
Sexy Por Man
The Wombue
Pombie Con A A Cat
The Ran Spean Sexy Sexy Pon Sexy Dander
Sexy Cat
The Gull Wot
Sexy Pot
Hot

In retrospect, I should have expected this. With a dataset this varied, the words the neural network learns first are the most common ones.

I checked in a little later, and things had improved somewhat. (Omitted: numerous repetitions of “sexy nurse”.) Still, the only thing that makes sense is the word Sexy.

Sexy The Carding Ging
Farbat of the Cower
Sexy The Hirler
A costume
Sexy Menus
Sexy Sure
Frankenstein’s Denter
A cardian of the Pirate
Ging butter
Sexy the Girl Pirate

By the time I checked on the neural network again, it was not only better, but astoundingly good. I hadn’t expected this. But the neural network had found its niche: costume mashups. These are actually comprehensible, if a bit hard to explain:

Punk Tree
Disco Monster
Spartan Gandalf
Starfleet Shark
A masked box
Martian Devil
Panda Clam
Potato man
Shark Cow
Space Batman
The shark knight
Snape Scarecrow
Gandalf the Good Witch
Professor Panda
Strawberry shark
Vampire big bird
Samurai Angel
lady Garbage
Pirate firefighter
Fairy Batman

Other costumes were still a bit more random.

Aldonald the Goddess of the Chicken
Celery Blue Frankenstein
Dancing Bellyfish
Dragon of Liberty
A shark princess
Statue of Witch
Cupcake pants
Bird Scientist
Giant Two butter
The Twin Spider Mermaid
The Game of Nightmare Lightbare
Share Bat
The Rocky Monster
Mario lander
Spork Sand
Statue of pizza
The Spiding hood
A card Convention
Sailor Potter
Shower Witch
The Little Pond
Spice of pokeman
Bill of Liberty
A spock
Count Drunk Doll of Princess
Petty fairy
Pumpkin picard
Statue of the Spice of the underworker

It was still fond of using made-up words, though. You’d be the only one at the party dressed as whatever these are.

Sparra
A masked scorby-babbersy
Scormboor
Magic an of the foand tood-computer
A barban
The Gumbkin
Scorbs Monster
A cat loory Duck
The Barboon
Flatue doctor
Sparrow Plapper
Grankenstein
The Spongebog
Minional marty clown
Count Vorror Rairol Mencoon
A neaving hold
Sexy Avical Ster of a balana Aly
Huntle starber pirate

And it ended up producing a few like this.

Sports costume
Sexy scare costume
General Scare construct

The reason? Apparently someone decided to help out by entering an entire costume store’s inventory. (“What are you supposed to be?” “Oh, I’m Mens Deluxe IT Costume – Size Standard.”)

There were also some like this:

Rink Rater Ginsburg
A winged boxer Ginsburg
Bed ridingh in a box Buther Ginsburg
Skeleton Ginsburg
Zombie Fire Cith Bader Ginsburg

Because someone had entered about 50 variations on Ruth Bader Ginsburg puns (Ruth Tater Ginsburg, Sleuth Bader Ginsburg, Rock Paper Ginsburg).

It invented some awesome new superheroes/supervillains.

Glow Wonder Woman
The Bunnizer
Ladybog
Light man
Bearley Quinn
Glad woman
robot Werewolf
super Pun
Super of a bog
Space Pants
Barfer
buster pirate
Skull Skywolk lady
Skynation the Goddess
Fred of Lizard

And oh, the sexy costumes. Hundreds of sexy costumes, yet it never quite got the hang of it.

Sexy Scare
Sexy the Pumpkin
Saxy Pumpkins
Sexy the Pirate
Sexy Pumpkin Pirate
Sexy Gumb Man
Sexy barber
Sexy Gargles
Sexy humblebee
Sexy The Gate
Sexy Lamp
Sexy Ducty monster
Sexy conchpaper
Sexy the Bumble
Sexy the Super bass
Pretty zombie Space Suit
sexy Drangers
Sexy the Spock

You bet there are bonus names – and oh please go read them because they are so good and it was so hard to decide which ones to fit into the main article. Includes the poop jokes. You’re welcome.

I’ve posted the entire dataset as open-source on GitHub.

And you can contribute more costumes, for a possible future neural net upgrade (no email address necessary).

Story titles, invented by neural network

lewisandquark:

So Prof. Mark Riedl of Georgia Tech is the best kind of geek, and used some cool scripting to extract all the things on Wikipedia that have plot summaries: movies, books, TV episodes, video games, etc. That’s a lot of plot summaries: 112,936, to be exact.

With a dataset this large, a neural network can achieve impressive results. Sure enough, when I trained this open-source neural network framework on just the titles alone, it consistently came up with titles that were both varied and (usually) plausible. 

Below are some of my favorites, arranged roughly by apparent genre:

Action/Adventure

Titanic Buffalo
Pirates: A Fight Dance Story
The Bad Legend
Conan the Pirate
O Bullets
Home Transformers
Shurk Hat Dies!
An Enemy of Bob (Homicide: Life on the Street)
Cannibal Spy II
American Hero: Fire of Crusty
Lego Man Hunt
Nancy Drew: The Last Day (film)
Surf Crisis
Legend of the Experience of Scarlet Freedom Damageboo
American Midnight: Swear Dragon
Problem

Scifi/fantasy

Under the Daleks
Batman and Flancles: The Fun Tree
The Legends of World Planet
Bomberman’s Love
The Enchanted Feed
The Star Wars: The Santa Contact
The Long Ninja Dove in the Air (film)
The History of the Galaxy Bunny Lada
City of the Stupid (film)
Shy Castle
Hamburger (Star Trek: The Next Generation)
Swords and Batman: Summer Party ?

Kids/Family

The Boordeeple (2011 film)
A Dog’s Toy Friends
Boop (Adventure Time)
A Dinosaur Quest
Colonel Corn (video game)
Scooby-Drum
New Bear
Borky the Pig (film)
Excellent Very Broken Christmas
The Great Bother Cat (film)
Happy Cat in the Yaku Wonder
Fireman and Halloween Rules
Big Can Flower Home
The Green Yaurglar Pig
Scooby-Doo’Wagon Traps (video game)
Book Dog (film)

Horror

Terror Dog
Tree Screaming
Zombies of Florence
The Trunkelling
A Vampire Time for Monster
Murder’s Eagle
Frozen Bat (film)
Haunted Place
The Sheep of Evil
Barney’s The Devil’s Treachery
Merry Scroobers: Crown of Evil
The Steel-Pounted Murder King
The Shadow of Life of Very Worgy (film)
The Mystical Booged of California

Documentary

Market that Knave
Spork at Bliss
The White Soup
An Indiana Office
The Last Fish Show
The Fish of Education

Restricted section (there were quite a few more of these)

Absilloved Lovers 2: Black Bearfly Dawn
Horse Man Academy 5-R: Cowboy Sheeper Wydex
Breed Bot 3: The Journey Kitchen
Wild Bad Party 109
Pink Moon
Indiscreet Maidman

And finally, a list of the most quintessential story titles, obtained by setting the creativity to near zero on a highly trained network:

The Story of the Stranger (1994 film)
The Last Day of the Story
The Lost Princess (film)
The Stranger (1994 film)
The Last Star (1994 film)
The Secret of the Story of the Stranger (1996 film)
The Stranger (2014 film)
The Story of the Stars
The Story of the Stranger (1999 film)
The Last Day of the Sun
The Story of the Star Trek: The Secret of the Story of the Star Wars

The neural network has weird ideas about what humans like to eat

biff-donderglutes:

noseforahtwo:

lewisandquark:

So I’ve been training this neural network to generate cookbook recipes by letting it look at tens of thousands of existing recipes.

The generated titles can get a bit odd.

There’s a creativity variable I can set when the network is generating new recipes (more on that below the lists); when I set it low, it comes up with its best guess at the most quintessential recipe titles:

Cream Cheese Soup
Cream Of Sour Cream Cheese Soup
Chocolate Cake (Chocolate Cake)
Chocolate Chocolate Chocolate Cake
Chocolate Chicken Chicken Cake
Chocolate Chocolate Chocolate Chocolate Cake
Chocolate Chips
Chocolate Chips With Chocolate Chips

When I tell it to get creative, things get even weirder.

Beef Soup With Swamp Peef And Cheese
Chocolate Chops & Chocolate Chips
Crimm Grunk Garlic Cleas
Beasy Mist
Export Bean Spoons In Pie-Shell, Top If Spoon and Whip The Mustard
Chocolate Pickle Sauce
Whole Chicken Cookies
Salmon Beef Style Chicken Bottom
Star *
Cover Meats
Out Of Meat
Completely Meat Circle
Completely Meat Chocolate Pie
Cabbage Pot Cookies
Artichoke Gelatin Dogs
Crockpot Cold Water
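
That “creativity variable,” by the way, is what text generators usually call sampling temperature; here is a minimal sketch of the idea, assuming a model that outputs per-character logits:

```python
import torch

def sample_next_char(logits, temperature=1.0):
    # Dividing the logits by the temperature reshapes the distribution:
    # near 0, the single most probable character always wins
    # ("Chocolate Chocolate Chocolate Cake"); at higher settings,
    # unlikely characters get picked more often ("Swamp Peef And Cheese").
    probs = torch.softmax(logits / temperature, dim=-1)
    return torch.multinomial(probs, num_samples=1).item()
```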

I’m Whip the Mustard.

There are tears in my eyes