Degree programme: MSc in Computer Science with a focus on Intelligent Systems
Career: ● Originally from Tunisia, living in Germany for 5 years ● Telecommunications engineer (Sup’com Tunisia) ● Master’s degree from the University of Passau (2016–2018) ● Deep learning and computer vision enthusiast ● Working at Zalando in Berlin for two years with the Outfits team
What defines your job
Mandatory requirements are: ● Deep understanding of the theory behind Machine Learning (ML) and Deep Learning ● Strong coding skills in Python and hands-on experience with ML libraries such as TensorFlow ● Ability to build prototypes (ML experiments) and to work under the uncertainty of research tasks ● Experience designing and building ML solutions for production systems ● (Optional) experience with cloud computing platforms (e.g., AWS, Google Cloud)
Necessary soft skills: ● Communication skills (English) with different types of audiences (product managers, software engineers, data engineers) ● Presentation and writing skills ● Ability to define and understand the customer problems of online fashion retailers ● Creativity ● Proactivity
What I like most: ● Applying machine learning and deep learning techniques inspired by natural language processing to solve a customer problem, mainly outfit recommendation ● Having an excellent data science community at Zalando that motivates me to keep learning ● The excitement of bringing ML products to our customers and seeing their engagement through A/B test results ● Being part of an international team that works hand in hand to bring fashion inspiration to Zalando customers
Most exciting challenge so far: ● Kickstarter week at Zalando: a hackathon week to develop an idea that solves a customer problem ● Bringing a new ML algorithm for outfit recommendation live ● Staying up to date with the latest trends in deep learning, NLP, and computer vision ● Running experiments for a research paper on the algorithms we use for outfit recommendation
Don’t get discouraged by: ● Data preparation and processing ● Failing research experiments ● The time needed to deliver ML products ● The number of tools you need to learn ● The different roles you have to play (data engineer, research engineer, software engineer) ● Not being able to follow all the new trends in academic research
My job is especially suitable for people studying: Computer science with a focus on machine learning
Job description: My job can be described in five parts: define, experiment, iterate, deliver, and monitor. I work closely with product management and senior applied scientists to identify impactful research problems that bring value to Zalando customers. We define the data we need, state a clear hypothesis, and define success criteria for the research task at hand. I autonomously conduct experiments following a well-defined research plan and proactively share results and insights. Once the experiment phase is validated, I work together with the team to deliver the ML solution. Continuous monitoring of the quality of the delivered solution is part of the process.
Speaker II: Nithin Thomas
Company: ZF Friedrichshafen AG
Position: Application Developer in Data Analytics area
Degree programme: MSc. Computer Science
Career: • From Kerala, India • Worked as a software quality analyst after my bachelor’s, from 2012 to 2015 • Master’s from the University of Passau (2015–2017) • Decided to become a data scientist during my master’s thesis • Working at ZF for almost 3 years, after starting as an intern in 2018
What defines your job
Mandatory requirements are: • Education in computer science focusing on machine learning and/or data analysis • Experience in the data science / machine learning domain, especially with KNIME and Python • Experience working in an agile environment • Readiness to learn new technologies and tools
Necessary soft skills: Good communication skills in English (German is beneficial), team player, problem solving, openness to criticism, creativity
What I like most: • Data analysis: finding hidden, valuable information for customers in unstructured data • Learning: new algorithms and methods for data analysis, new technologies and tools for deployment • Architectural design: identifying and implementing the infrastructure for product deployment • Discussions with colleagues: talking about new trends and about problems in our projects • Being part of a new, energetic, and diverse team in a big company
Most exciting challenge so far: • Working with KNIME, a citizen data science tool, was a new experience as well as a challenge • Being part of the infrastructure design team is still exciting, since I have never done it before • Generalizing products is one of the major challenges we face as a team
Don’t get discouraged by: The time-consuming task of data preparation, occasional changes in interim goals, solutions that don’t work out
My job is especially suitable for people studying: Computer science with a focus on artificial intelligence, machine learning, or data science
Job description: We follow the CRISP-DM process model for our data science activities. At the start of a project, we usually hold a workshop with the customer (internal or external) to achieve ‘Business understanding’. The next step is ‘Data understanding’, which involves data collection, an initial clean-up of the data, and multiple meetings with the customer. Then we do ‘Data preparation’ followed by ‘Modelling’. Results from data preparation and modelling are discussed with the customer, and we go through the cycle again if necessary. Once we achieve our goals, we move on to ‘Deployment’. Besides regular data science projects and products, we are also involved in building a better infrastructure for our product catalogue. Designing architectures with product and DevOps experts is one of the important tasks in my job profile.
Registration information: Participation in the webinar requires registration in Stud.IP. The access link will be announced in the course shortly before the event.
In cooperation with the iStudiCoach
Enrolment is binding; participants cannot unsubscribe themselves.