A supermarket chain in New Zealand had to make some changes to its artificial intelligence-powered recipe app after it delivered some potentially deadly recommendations. The Savey Meal-bot from Pak'nSave came up with meal plans or recipes when customers entered lists of ingredients, but users soon discovered that the bot could add nonfood items to recipes, leading to bizarre results, the Guardian reports. One such result, an "aromatic water mix" the bot called the "perfect nonalcoholic beverage to quench your thirst and refresh your senses," was a recipe for chlorine gas. "Serve chilled and enjoy the refreshing fragrance," the bot suggested.
The recipe was tweeted by political commentator Liam Hehir, who said he'd asked the Meal-bot what he could make with only water, bleach, and ammonia. Other users found that the bot, in the same chirpy tone, offered recipes for things like ant-poison jelly sandwiches. "Serve the methanol-glue-turpentine coated bread slices with the tomato and potato mixture," one recipe advised, per Newshub. Hehir says he was surprised by the lack of safety features, though he notes that "you'd have to be a real idiot" to attempt any of the dangerous recipes.
The supermarket chain, which says the app was created to reduce food waste and help households save money by making meals from leftovers, wasn't amused. A spokesperson said a "small minority have tried to use the tool inappropriately and not for its intended purpose," per the Guardian. The supermarket says it will continue "fine-tuning" its controls. A Meal-bot warning notice now states that results aren't reviewed by humans and that there's no guarantee "that any recipe will be a complete or balanced meal, or suitable for consumption." (More artificial intelligence stories.)