
What’s cooking at STATWORX?


Can AI make you a better cook?

As an enthusiastic hobby chef, I got quite excited after stumbling upon possible applications of Word Embeddings during a Natural Language Processing course I took at university. For a semester project, our professor had suggested finding replacements for ingredients in cooking recipes by learning the relations between words from a large collection of recipes. The project sadly never materialized, but the idea has stuck with me ever since. So, I decided to finally find out whether AI could help me become a better cook…

With this blog post, I would like to:

  1. detail how Word Embeddings work and can be learned
  2. train embeddings on a recipe dataset
  3. publicly reveal my trusted carrot cake recipe
  4. find replacements for all non-essential ingredients in the recipe, at least such that the end result may still be considered a cake
  5. bake both recipes and submit them to my picky colleagues for a thorough inspection
  6. impress Nigella?

Why the cake? At STATWORX, we simply love cake – and as a side effect, sugar helps us push the limits for our clients when it comes to coding! 😉 Every new employee and each birthday child will prepare a cake and become everybody’s best friend in doing so – at least for the day. Since it was my turn, I thought I’d give the whole thing a twist.

Word Embeddings

Simply put, a Word Embedding is a collection of numbers that represents a word. To the human eye, these numbers are entirely meaningless, and they will also look different every time you re-run the algorithm. The trick is to learn them once on a huge amount of text (think all of Wikipedia for a large-scale example) and save them for further use. They can serve as inputs for Machine Learning models; in my case, I will compare them to find similarities between words.

So, very briefly, how are these embeddings learned? Imagine an arrow hovering over a word in a text, and look at the two words to its left and the two words to its right (ignoring all punctuation). These words make up the context for the word in the middle, as shown in the figure below.

[Figure: the sliding context window – two words on either side of the center word]

We use a neural network to learn this context and capture it in the resulting Word Embedding of the word in the middle. You then move the arrow hovering over that word one place to the right and repeat the process for the next word. You go through your entire text multiple times until some maths tells you to stop. Many words will have appeared in many different contexts, and slowly the intricacies of the relations between all words in the text will be captured.

For instance, the words ‘Reagan’ and ‘Bush’ will be considered fairly similar, as they will have appeared in many similar contexts, whereas ‘sunflower’ and ‘airplane’ will seem dissimilar.

For every ingredient in the following recipe, I will be looking up the words the algorithm has deemed most similar. We can assume that similar words appear in similar contexts, and in terms of cooking, that hopefully makes them adequate substitutes for the original ingredients.
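If you want to try this yourself, here is a minimal sketch of how such embeddings could be trained with gensim’s Word2vec implementation (whose most_similar lookups also appear further below). The file name and parameters are illustrative assumptions, not necessarily the exact setup behind this post:

from gensim.models import Word2Vec

# Hypothetical corpus file: one tokenized recipe per line, lowercased so that
# 'Carrots' and 'carrots' end up as the same word.
with open("recipes.txt", encoding="utf-8") as f:
    sentences = [line.lower().split() for line in f]

vec = Word2Vec(
    sentences,
    vector_size=100,  # length of each word embedding
    window=2,         # two context words on either side, as described above
    min_count=5,      # ignore words that appear fewer than five times
    epochs=10,
).wv                  # keep only the learned word vectors

print(vec.most_similar(["carrots"]))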

Coming to the original human-made recipe:

  • 350-400g grated carrots
  • 250g flour
  • 250g sugar
  • 250ml of a neutral-tasting vegetable oil
  • 4 eggs
  • Couple of squirts of vanilla aroma/essence/…
  • A tiny bit of salt
  • 4 teaspoons of cinnamon.

Taste the dough before baking and see if it can take some more; it’s crucial to commit here!

  • 1 pack vanilla sugar
  • 1 pack baking powder
  • 1 pack baking soda

(A bit less important if you don’t have one of them – the cake will just turn out a little denser – but better to add them, I think.)

  • 200g mixed unsalted nuts,

Put them in a towel, fold it up and bash with a rolling pin or the wine bottle you just emptied until it’s a variety of smaller and larger bits.

  • 200g cream cheese
  • 100g icing sugar

Mix everything in a bowl, no need to whisk the eggs. Grease the pan and bake at 180°C (fan) for 40 minutes. It’ll rise quite a bit, and when you check whether the cake still sticks to the knife, you’ll probably need to bake it for another ~15 minutes – but it’s alright if it’s still a little moist inside. Once the cake has cooled off, mix the 200g of cream cheese/Philadelphia with the 100g of icing sugar and spread it on top.

Finding replacement ingredients

The following code excerpts show the words most similar to a given ingredient. The score is the cosine similarity between the two word vectors: the higher it is, the more similar the algorithm considers the words, based on all the text it was given.

You can immediately see that the results need some inspection and cleaning, as the most similar word to carrots is carrot. This is fairly common, and good pre-processing of the data is worth a lot, especially for removing typos and special characters and for joining compound words like celery root into celery_root.
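One way to get such compound ingredients into the vocabulary is to join frequently co-occurring tokens with an underscore before training, for example with gensim’s Phrases model – a rough sketch, not necessarily my exact cleaning pipeline:

from gensim.models.phrases import Phrases, Phraser

# 'sentences' is the tokenized recipe corpus from the training sketch above.
bigram = Phraser(Phrases(sentences, min_count=10, threshold=10.0))
sentences = [bigram[tokens] for tokens in sentences]  # 'celery', 'root' -> 'celery_root'
# ...then re-train the Word2vec model on this phrased corpus.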

vec.most_similar(['carrots'])
[('carrot', 0.76),
 ('parsnips', 0.70),
 ('turnips', 0.69),
 ('celery', 0.64),
 ('rutabaga', 0.63),
 ('turnip', 0.63),
 ('parsnip', 0.61),
 ('green_beans', 0.59),
 ('baby_carrots', 0.58),
 ('celery_root', 0.55)]
##############################################

vec.most_similar(['vanilla_essence'])
[('custard_powder', 0.69),
 ('almond_essence', 0.67),
 ('ground_almonds', 0.66),
 ('vanilla_sugar', 0.65),
 ('essence', 0.64),
 ('castor', 0.63),
 ('condensed_milk', 0.63),
 ('caster_sugar', 0.62),
 ('bicarbonate_of_soda', 0.62),
 ('golden_syrup', 0.60)]
##############################################

vec.most_similar(['cream_cheese'])
[('cottage_cheese', 0.51),
 ('mascarpone_cheese', 0.48),
 ('sour_cream', 0.48),
 ('whipping_cream', 0.48),
 ('cream_cheeses', 0.45),
 ('creamy', 0.45),
 ('mascarpone', 0.44),
 ('ricotta_cheese', 0.42),
 ('heavy_whipping_cream', 0.42),
 ('marshmallow_creme', 0.42)]

All replacements are listed in the following table. While some are not the highest-ranked suggestion, they are still deemed similar, and I took some artistic liberties depending on the flavor of the day (a small sketch of how such a shortlist could be pulled automatically follows the table).

Original         Replacement
carrots          half parsnips, half celery root
wheat flour      rye flour
sugar            brown sugar
vegetable oil    olive oil
vanilla essence  custard powder
mixed nuts       walnuts
cream cheese     mascarpone
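For completeness, here is a small, purely hypothetical helper that would pull such a shortlist automatically by dropping suggestions that are merely another spelling of the same ingredient – a sketch of the idea, not the procedure actually used for the table above:

def substitute_candidates(vec, ingredient, topn=10):
    # Crude normalization: 'baby_carrots' -> 'babycarrot', so that variants of
    # the original ingredient can be recognized and skipped.
    base = ingredient.replace("_", "").rstrip("s")
    keep = []
    for word, score in vec.most_similar([ingredient], topn=topn):
        if base in word.replace("_", "").rstrip("s"):
            continue  # skip 'carrot', 'baby_carrots', 'cream_cheeses', ...
        keep.append((word, score))
    return keep

for original in ["carrots", "vanilla_essence", "cream_cheese"]:
    print(original, "->", substitute_candidates(vec, original)[:3])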

And here are the results…

Making the two cakes was fairly similar: while grating, the parsnips and celery root felt a bit drier, yet the AI-altered cake (pictured on the right) turned out moister than the one from the original recipe (considered a plus by the test audience). The (subtle) taste of the parsnips and celery root went along nicely, adding a little gingerbread note to the cake. As cinnamon was added to both cakes, the overall taste was quite comparable. While I would have liked to add different spices to the second cake to make it more distinct, this does show that the embeddings came up with good alternative ingredients! For the next experiment, I should perhaps venture a little further down the substitute rankings and give the midfield a try.

[Figure: the two finished cakes, with the AI-altered version on the right]

Conclusion

Overall, the cakes were very well received; the taste was described as far more pleasant than the list of substitutes had sounded to some hesitant colleagues. I can only recommend you give either one a try!

So why this fun exercise? How do Word Embeddings help us overcome challenges for our clients? Capturing the meaning and relations between words is hugely important when identifying information in text. The embeddings provide a foundation for more complex tasks and models in Natural Language Processing and Machine Learning.

Say you wish to identify only certain key information in a newspaper article. While classical text-parsing methods will correctly classify the text as a whole, they do not help you pick out the few pieces you actually want from the flood of information. Enter Word Embeddings and Deep Learning: the relations captured by well-trained embeddings, together with Deep Learning models that take the context of a sentence into account, reliably help us retrieve exactly what our clients want to know.

This does not stop at the word level either: extensions of this algorithm allow entire documents or images to be embedded and compared. To end on a culinary note, recent research has used this to pair images with recipes, generating a recipe and cooking instructions for any image of a plated dish!

So, pick an evening, toss the tie and don the apron!

About the author

Jonas Braun

Besides being a data scientist interested especially in deep learning, I enjoy the outdoors, anything culinary and also the arts. Bridging deeply rooted 'real world' elements with tech is what drives me.
