Happy Valentine's Day by Nerds

Real nerds graph hearts on Valentine's Day instead of drawing them. My drawing skills are not what I'd like them to be, but my R skills are! Therefore, let's draw a heart in R instead of on paper!

Heart curve

dat <- data.frame(t = seq(0, 2*pi, by = 0.01))
xhrt <- function(t) 16 * sin(t)^3
yhrt <- function(t) 13*cos(t) - 5*cos(2*t) - 2*cos(3*t) - cos(4*t)
dat$x <- xhrt(dat$t)   # the x and y columns must be computed before plotting
dat$y <- yhrt(dat$t)

with(dat, plot(x, y, type = "l", axes = FALSE, frame.plot = FALSE, xlab = "", ylab = ""))
with(dat, polygon(x, y, col = "#FF7575"))
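For Python fans, the same parametric curve can be traced with the standard library alone (plotting is left out so the snippet stays self-contained; feed xs/ys to matplotlib if you want the picture):

```python
import math

# Parametric heart curve:
#   x(t) = 16 sin^3(t)
#   y(t) = 13 cos(t) - 5 cos(2t) - 2 cos(3t) - cos(4t)
ts = [i * 0.01 for i in range(int(2 * math.pi / 0.01) + 1)]
xs = [16 * math.sin(t) ** 3 for t in ts]
ys = [13 * math.cos(t) - 5 * math.cos(2 * t)
      - 2 * math.cos(3 * t) - math.cos(4 * t) for t in ts]

# The curve spans roughly x in [-16, 16] and y in [-17, 12]
print(round(min(xs)), round(max(xs)), round(min(ys)), round(max(ys)))  # -16 16 -17 12
```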

Inspired by: http://mathworld.wolfram.com/HeartCurve.html


Willybot logo

Willybot The Movie – The Virtual Assistant that makes (y)our meeting fun again

* read in a typical Tell Sell voice *

Are you tired of long and dull meetings? Are your notes always lost? Do you have those awkward silences at the beginning of meetings, when everybody looks around to see who's going to take notes? Those days are over!

We introduce you to Willy, created by seven Data Scientists at Cmotions with the latest state-of-the-art algorithms. This virtual assistant is here for you! Based on its patented algorithms, Willy can schedule your meetings AND take notes during them. The notes are emailed to all attendees, which means NO more cramps in your fingers, NO more forgetting to take notes and NO more losing your notes.


Does the meeting suck all the oxygen out of the room and the energy out of your body? Are you in desperate need of some coffee, espresso or cappuccino? Or would you rather have something to eat? Willy can handle both: he can send your order to the coffee machine or order some sandwiches at your local lunchroom.

Is that all Willy can do? Of course not: with Partymode, your Friday afternoon automatically turns into a festive blowout.

Warning: If your Willy shows a tendency towards world domination, please immediately unplug!

All kidding aside, the Willybot was an awesome project to work on. A project which emerged because we wanted to learn more about speech recognition and text analytics. A project of which we are quite proud: all features shown in the video actually work. Ask Willy to schedule a meeting and he'll look through all participants' agendas and find the first available time slot. Ask Willy to take notes and he'll send a summary of the transcribed speech, together with a to-do list, to all participants present. Willy is also connected to our coffee machine: kindly ask for an espresso and that's what you'll get. Hungry? Just tell Willy your order and he'll contact your local lunchroom and place it.

Curious how we did this? Get in contact and we’ll tell you everything during a cup of coffee, which Willy ordered!


Project Friday 3.2 – Virtual Assistant with Voice Recognition

In our previous post, we explained what happens when you put six data scientists in an apartment in Luik for a weekend: you end up with a Virtual Assistant, which we named Willy (we know the name doesn't translate that well to English). Willy is able to understand us when we ask him to schedule a meeting; he then checks the calendars of the colleagues in question and schedules an appointment at the first moment all of them are available.
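The post doesn't show the scheduling code, but the "first moment everyone is available" step is essentially an interval-merge problem. A minimal sketch, with made-up times expressed in minutes since midnight:

```python
def first_free_slot(busy, duration, day_start, day_end):
    """Return the start of the first gap of at least `duration` minutes
    in which nobody is busy, or None if there is no such gap.
    `busy` pools the (start, end) meetings from every calendar."""
    merged = []
    for start, end in sorted(busy):          # merge overlapping meetings
        if merged and start <= merged[-1][1]:
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    cursor = day_start
    for start, end in merged:                # scan the gaps in order
        if start - cursor >= duration:
            return cursor
        cursor = max(cursor, end)
    return cursor if day_end - cursor >= duration else None

# Two calendars: 9:00-10:00 and 9:30-11:00 are busy; find 60 free minutes.
print(first_free_slot([(540, 600), (570, 660)], 60, 540, 1020))  # 660, i.e. 11:00
```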

Rights reserved by https://gocomics.typepad.com


This is part of what we call Project Friday, an initiative we use to do projects that don't generate money, but do contribute to our knowledge of A.I. and Machine Learning, and definitely to the work atmosphere as well.

At Cmotions, at the start of a meeting, someone sometimes suggests that it would be useful if someone wrote down what was said and who has which action points. However, it is always the case that the person who throws this on the table doesn't want to do it him/herself. But then again, neither do the others, which results in an awkward silence until the weakest link caves.

After Luik, our ambition wasn't satisfied yet. Therefore, we reserved an office space in Utrecht, set a goal for a new feature and got on with it! During this afternoon, we created a feature where, during a meeting, you can lay your phone on the table and record everything with Telegram (which can be considered an alternative to WhatsApp). The recording is sent to our Virtual Assistant, which then checks which colleagues were present at the meeting, transcribes the full audio file and mails a summary of the meeting to the participants (including a to-do list per person).
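The post doesn't explain how the per-person to-do list is derived from the transcript; as a deliberately naive sketch, one could scan sentences for a participant's name plus an action cue (the cue words and names below are our invention, not the actual implementation):

```python
# Illustrative cue words -- a stand-in for whatever Willy actually uses.
ACTION_CUES = ("will ", "should ", "must ")

def todo_list(transcript, participants):
    """Assign a transcript sentence to a participant's to-do list when
    it names the participant and contains an action cue."""
    todos = {p: [] for p in participants}
    for sentence in transcript.split("."):
        s = sentence.strip()
        for p in participants:
            if p in s and any(cue in s.lower() for cue in ACTION_CUES):
                todos[p].append(s)
    return todos

notes = "Anna will send the report. We discussed the weather. Bob should book the room."
print(todo_list(notes, ["Anna", "Bob"]))
# {'Anna': ['Anna will send the report'], 'Bob': ['Bob should book the room']}
```

A real system would of course need a far smarter approach (named-entity recognition, verb analysis), but the shape of the output is the same.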

Curious for more? We'll post more in the future! In upcoming articles we'll tell you how speech recognition works and which features we implemented in the next Project Friday sessions. Can't wait, or want a real-life demo? Drop by for a cup of coffee, or meet us at the Klantenservice Federatie on February 6th in Bunnik, The Netherlands.


Project Friday 3.1 – Willybot: our Virtual Voice Assistant

What do you get when you put six data scientists and a crate of beer in an apartment in Luik for the weekend? …a virtual voice assistant, of course!

We have something we call Project Friday. Every now and then we lock ourselves in a room for a Friday afternoon, or for a whole weekend in this case, and work on a project which doesn't generate any money! We're not your typical consultancy company. Why do we do this? Two reasons: 1) to learn the boundaries of the current state of A.I. and 2) just because it's fun!

After putting a brain and an eye on a coffee machine and playing around with a drone inside our office, we decided it was time for a new project: Willybot, ''the assistant of Cmotions''. Bored with taking notes in meetings, and triggered by the Google Assistant (Duplex) that could make an appointment with a hairdresser, we decided to see whether we could build our own virtual assistant.

So how did we go about it? Well, we put six colleagues in a humid apartment in Luik, supplied them with beer, chips and an Azure virtual machine, and magic happened! After two hard days of work (and two nights of partying) we came up with a voice assistant that is able to summarise our spoken and written words, send automated summaries, make appointments and create to-do lists. And all by simply sending a voice message to our Willybot on Telegram.

What did we use?

  • Telegram
  • Google speech-to-text API
  • Azure virtual machine
  • Python

What did we learn so far?

  • The Google speech-to-text API is quite good at transcribing general Dutch conversations
  • We love Telegram (but we actually already knew that)
  • We love Python even more (but again, nothing new here)
  • Luik is not the prettiest city but luckily we know how to party

In the coming months we will develop new features for our Willybot. Stay tuned for more blog posts and videos on this cool project. Or come and say hi to our Willybot in person at our office in Amersfoort.

Written by Sebastiaan de Vries


We crunch(ed) Artificial Intelligence Meetup – part three

For our third Meetup about putting Artificial Intelligence (A.I.) into practice, we were very happy to welcome Thomas Stalman and Peter van Lith. They had two quite different, but very interesting stories. And despite the beautiful weather and the national strike in regional public transport, they had a full room of people to share their stories with.


Thomas started the evening with his story about welkrestaurant.nl. Have you ever found yourself in the situation that you were in a city which you didn’t know so well and were looking for a nice place to have some dinner? Fear no more, because now we have AI, in the form of welkrestaurant.nl, which can help you find a nice restaurant in the city you’re in, based on another restaurant you do know and like in any other Dutch city. During his presentation Thomas took us on the journey of building a recommender system using, amongst other information, text mining on the reviews people gave about a restaurant.



Peter started his talk by showing us some videos of (humanoid) robot football with his own humorous commentary in the voice-over, which was a brilliant introduction to the subject. Robot football is something the TU Eindhoven is really good at, seeing that they won the World Championship for the fourth time in the last seven years. Peter took us on a whole different journey than Thomas did: along the path of creating (distorted) images for training purposes, training neural networks combined with the robots' existing programmed behaviour, and how this should all work on the robots in the end.


All in all we really enjoyed the evening and we want to thank our speakers, but also our audience! It was great to see you all coming (again, for some of you) to our headquarters in Amersfoort. Thank you for your attention and questions, and we hope to see you at our next Meetup, which will be on Wednesday 17 October 2018. Please sign up for The Analytics Lab Meetup group to make sure you stay informed about our Meetups!

We definitely hope to see you there!


Let R/Python send messages when the algorithms are done training

As Data Scientists, we often train complex algorithms to tackle certain business problems and generate value. These algorithms, however, can take a while to train. Sometimes they take a couple of hours, hours which I'm not going to spend just sitting and waiting. But regularly checking whether the training is done isn't the most efficient way either.

Now I've started to use Telegram to send me notifications from R and Python that let me know when training is done. Furthermore, I also use it, for example, to send notifications when pipelines / ETLs fail, which allows me to repair them as soon as they fail.

It's really easy, so I thought I'd share my code!

First, after you've installed Telegram, search for the BotFather, which is a bot from the app itself. When you text /newbot and follow the instructions, it will create your first bot and give you a token. Copy this!

The next step is to find the chat id to send messages to. Find your bot in Telegram and say something to it. Then, go to https://api.telegram.org/bot<token>/getUpdates in your browser, where it should show you your chat id.
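If it helps, this is roughly the shape of the JSON that getUpdates returns, and how to dig the chat id out of it (field names follow the Telegram Bot API; the ids below are made up):

```python
import json

# A trimmed example of a getUpdates response after you text your bot:
payload = json.loads("""
{"ok": true,
 "result": [{"update_id": 1,
             "message": {"message_id": 7,
                         "chat": {"id": 123456789, "type": "private"},
                         "text": "hello bot"}}]}
""")

def extract_chat_id(updates):
    """Pull the chat id out of the most recent update."""
    return updates["result"][-1]["message"]["chat"]["id"]

print(extract_chat_id(payload))  # 123456789
```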

Finally, install the necessary packages for R [install.packages('telegram')] and/or Python [pip install python-telegram-bot]. Note that the Python package providing the telegram module is python-telegram-bot; pip install telegram fetches a different, unrelated package. And you're ready!

For R, use the following function:

send_telegram_message <- function(text, chat_id, bot_token){
  require(telegram)
  bot <- TGBot$new(token = bot_token)
  bot$sendMessage(text = text, chat_id = chat_id)
}

And this one for Python:

def send_telegram_message(text, chat_id, bot_token):
    # Note: python-telegram-bot v20+ made these calls async;
    # this snippet follows the older synchronous API.
    import telegram
    bot = telegram.Bot(token=bot_token)
    bot.send_message(chat_id=chat_id, text=text)
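To wire this into a training run, you might wrap the long-running job so a message goes out on success or on failure. A sketch (the decorator and train_model are illustrative names, not from the original post; the stand-in sender just collects messages so the snippet runs without a bot token):

```python
import time

def notify_on_completion(send_message):
    """Decorator factory: wrap a long-running job so a notification is
    sent when it finishes or fails. `send_message` can be any callable
    taking a text argument, such as a Telegram sender."""
    def decorator(func):
        def wrapper(*args, **kwargs):
            start = time.time()
            try:
                result = func(*args, **kwargs)
            except Exception as exc:
                send_message(f"{func.__name__} FAILED: {exc!r}")
                raise
            minutes = (time.time() - start) / 60
            send_message(f"{func.__name__} finished after {minutes:.1f} min")
            return result
        return wrapper
    return decorator

# Stand-in sender that just collects messages, so this runs without a token:
messages = []

@notify_on_completion(messages.append)
def train_model():
    return "model"

train_model()
print(messages[0])  # e.g. "train_model finished after 0.0 min"
```

Swapping messages.append for a partial application of send_telegram_message gives you the real notifications.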


Education for the Next Generation: a Handsign recognition project in Python

“Could you create a handsign recognition model which we can use to teach High School students a bit more about A.I. in a fun way?”
This is the question a few colleagues asked me a couple of weeks ago and, of course, the only real response here could be YES! I was immediately enthusiastic and started working on this fun project.

After a lot of messing around with different models, among which xgboost and neural networks, I found a real goldmine: the GitHub page of loicmarie, who created a script to not only train such a model using an Inception Model (a convolutional neural network classifier), but also to use it. So I combined my own scripts with those of loicmarie and we were ready to go!

The Inception V3 model is a deep learning model created by Google, trained on images from ImageNet.


The Inception Model is capable of classifying images into 1,000 classes with a near-human error rate. An impressive model, which isn't only cool on its own, but can also be used for Transfer Learning: we reuse the knowledge from this model and extend it with our own images. This makes it quite "easy" and "fast" to create a well-performing model on our own images, which, in this case, are different handsigns.

When we arrived at the High School, we first gave the students an introduction to what A.I. actually is and where they encounter A.I. in their world. After that we introduced them to our handsign recognition model and gave them the assignment to create their own handsigns.


They then used a script to take their own pictures of each handsign.


And then it was time to put our computers to work! It started with a script to generate 10,000 pictures for each handsign. As soon as this script was done, the training of the model started.
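The post doesn't detail how a handful of photos becomes 10,000, but data augmentation (random distortions of each original) is the standard trick. A toy sketch on a 2x2 "image"; the flip and brightness jitter are illustrative assumptions, not the project's actual distortions:

```python
import random

def augment(image, n_variants, seed=0):
    """Generate n_variants distorted copies of a grayscale image
    (a nested list of 0-255 ints) using horizontal flips and
    brightness jitter."""
    rng = random.Random(seed)
    variants = []
    for _ in range(n_variants):
        img = [row[:] for row in image]
        if rng.random() < 0.5:                      # random horizontal flip
            img = [row[::-1] for row in img]
        delta = rng.randint(-30, 30)                # random brightness shift
        img = [[min(255, max(0, p + delta)) for p in row] for row in img]
        variants.append(img)
    return variants

tiny = [[0, 128], [255, 64]]                        # a 2x2 toy "photo"
copies = augment(tiny, n_variants=10000)
print(len(copies))  # 10000
```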


After 23 hours (!) all the models were successfully trained and it was battle time! The group that could write the most flawless text using their own handsigns within 5 minutes won the project!



Check out this video to see how it works:


Are you interested in our code? Please feel free to take a look at our GitHub repo!


Project Friday 2.2: let’s fly!

A little while ago we started our second Project Friday; once a month (or so) we lock some colleagues in a room with a couple of beers and a fun project. This project: give a drone a brain and an eye, so we can call it and make it do stuff for us. Why do we do this? Well, because it's fun, and we learn a lot.

During our first session we mostly discovered how difficult it was to maneuver the drone around inside our office building. This didn’t put us off even a tiny little bit, we love a good challenge!

We spent most of this session thinking of a way to use the camera on the bottom of the drone to make it follow a path we've laid out for it. Put simply: we want the drone to be able to follow a line on the floor. The first thing we did was create a line of white adhesive tape on our dark carpet. After that, we held the drone above this line to take pictures. And then the thinking started… We had to make sure we took every possible deviation into account and thought of the best way to correct the drone if that deviation occurred. Believe it or not, this drawing helped us do that.

While thinking of every possible deviation and the correction that had to be applied for it, we immediately programmed it into our Python script for the drone. As soon as this script was done, it was time for our first test flight, which you can see in this video:

Ok… Not successful yet. Enough work left for some more Friday afternoons. As far as we're concerned: bring it on!
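For the curious, here is a minimal sketch of the kind of correction logic involved in following a white line on a dark floor. The threshold, gain and proportional mapping are our illustrative assumptions; the post doesn't show the actual script:

```python
def line_offset(row, threshold=200):
    """Given one row of grayscale pixels from the bottom camera, return
    the white line's centre as an offset in [-1, 1] from the image centre
    (negative = line is left of centre), or None if no line is visible."""
    bright = [i for i, p in enumerate(row) if p >= threshold]
    if not bright:
        return None
    centre = sum(bright) / len(bright)
    half = (len(row) - 1) / 2
    return (centre - half) / half

def correction(offset, gain=30):
    """Map the offset to a roll command in degrees (proportional control)."""
    return -gain * offset

row = [10] * 40 + [250] * 5 + [10] * 55   # tape slightly left of centre
off = line_offset(row)
print(round(off, 2), round(correction(off), 1))  # -0.15 4.5
```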


Read more about what we did before


Escape Room Hackathon – Security Footage!

I, the mathematician Piet A. Choras, was until recently wrongfully imprisoned for overfitting and was planning my escape. I found out, iteratively, that I couldn't do it alone and called in help. Teams of analysts had one afternoon to help me escape from the Penitentiary Institution (π) by solving (data) analysis puzzles. The solutions to these puzzles revealed the pieces that would secure my escape.

I would like to take this opportunity to thank all participants (ABN Amro, Aegon, ANWB, Marug, PGGM, RDC and Vivat) for their survival instinct, fighting spirit and Brute Force in freeing me from my cell. I am very proud that it succeeded and that I can now enjoy my degrees of freedom. My escape was, however, captured by the security cameras; see the result here:

Kind regards,

Piet A. Choras



ANWB helps free Piet A. Choras during the Escape Room Hackathon 2018!

The annual hackathon of The Analytics Lab and Cmotions has been won by ANWB. For the third edition of this event, the Atoomclub in Utrecht was transformed into a true escape room. This escape room hackathon revolved around freeing Piet A. Choras, who was imprisoned for overfitting in 'Penitentiary Institution (indeed, Pi) De Atoomclub'. The participating teams were presented with a number of analytical puzzles that, step by step, brought them closer to proving the innocence of the world-famous mathematician. They had to trace the location of a stranded package, find a corrupt prison guard and, using a number of visualisation techniques, calculate how long Piet A. Choras had been locked up.

The guards' remarks and the interrogation of the prison director were listened to carefully and recorded, knowing that a hint could be hidden in every detail. The participants also searched the prison decor for clues: from the portraits of historical mathematicians to the Atoomclub's bookcase, every prop was studied with surgical precision.

Guided by subtle and less subtle hints, the participants were gradually put on the right track, so that during the final round all teams could sink their teeth into a classification model to free Piet A. Choras. Vivat finished in a fine second place, at a respectable distance from the other participants. But, just like last year, ANWB stayed well ahead of the competition. They successfully defended their title, winning the coveted candy jar and a visit to a "real" Escape Room.

Congratulations, ANWB!