
[CV] Junseong Kim

Machine Learning Engineer at ScatterLab. I am a developer who is serious about making products that change the world.
Machine Learning Engineer, ScatterLab
codertimo@gmail.com / +82 10-5068-2283

Experience

ScatterLab (4 years, 2 months)

We are building an AI friend (Luda Lee) to provide relationships for everyone.
Ranked 1st on the Apple App Store and Google Play Store in the Social Networking category.

Product Owner, Full-Time

Machine Learning Engineer, Full-Time (Oct 2021 - Present, 1 yr 2 mos)

Project Lead. Redeveloped all ML inference services from scratch for a larger model update. Optimized for the best performance per cost using AWS spot instances, AWS Inferentia with TensorFlow Serving, FasterTransformer on GPU, FastAPI, Kubernetes, and other techniques; a minimal sketch of such a service follows this list.
Project Lead. Optimized a GPT inference server for real-time text generation, achieving 3x faster generation than the baseline.
Project Lead. Optimized RoBERTa inference on IPUs, achieving 5x lower cost and 2x faster inference.
Optimized large-scale TensorFlow training on TPU Pods, achieving 3x faster training.
Developed the backend of an internal data labeling tool and data sampling over large raw conversation data.
Managed the research team's GCP environment to provide a convenient and cost-effective cloud workspace.
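As an illustration of the serving setup described above, here is a minimal sketch of a FastAPI front end that forwards requests to a TensorFlow Serving backend. The model name, endpoint URL, and request schema are hypothetical placeholders, not the production configuration.

```python
# A minimal sketch of the serving split described above: a lightweight FastAPI
# front end that forwards requests to a TensorFlow Serving backend. The model
# name, endpoint, and request schema are hypothetical placeholders.
import httpx
from fastapi import FastAPI
from pydantic import BaseModel

# Assumed TF Serving REST endpoint (model name "reply_model" is hypothetical).
TF_SERVING_URL = "http://localhost:8501/v1/models/reply_model:predict"

app = FastAPI()


class PredictRequest(BaseModel):
    context: list[str]  # multi-turn dialog context, most recent turn last


@app.post("/predict")
async def predict(req: PredictRequest) -> dict:
    # Forward the request to TF Serving's REST API and return its predictions.
    async with httpx.AsyncClient() as client:
        resp = await client.post(
            TF_SERVING_URL, json={"instances": [{"context": req.context}]}
        )
        resp.raise_for_status()
    return resp.json()  # {"predictions": [...]}
```

Keeping the HTTP layer separate from the model server is what lets the two be scaled and scheduled independently on Kubernetes (e.g., spot or Inferentia-backed nodes for the model replicas).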

Machine Learning Research Scientist, Full-Time (Oct 2018 - Sep 2021, 3 yrs)

Project Lead. Researched dialog response retrieval over 10M+ response candidates using approximate nearest neighbor (ANN) search; see the sketch after this list.
Project Lead. Researched automated person name de-identification for data privacy protection.
Project Lead. Researched a similar-query matching model to reply with persona-oriented responses.
Researched an intent classification task that decides whether a given context is goal-oriented or chitchat.
Researched a noun and keyword recognition model to capture the dialog topic.
Researched a next-topic prediction model for multi-turn dialog contexts.
Developed a language model fine-tuning and evaluation pipeline covering 5 NLU tasks using Kubeflow.
Developed the Pingpong-Builder backend logic at the prototype stage.
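The ANN-based response retrieval mentioned above can be sketched roughly as follows, using FAISS as the index. The bi-encoder setup, embedding dimensionality, and index parameters are illustrative assumptions, not the production configuration.

```python
# A minimal sketch of ANN-based response retrieval with FAISS, assuming a
# bi-encoder setup: contexts and candidate responses are embedded into the
# same vector space, and top candidates are found by inner-product search.
# The index parameters below are illustrative, not the production values.
import faiss
import numpy as np


def build_index(response_embeddings: np.ndarray, nlist: int = 1024) -> faiss.Index:
    """Build an IVF index over the (float32) candidate response embeddings."""
    dim = response_embeddings.shape[1]
    quantizer = faiss.IndexFlatIP(dim)
    index = faiss.IndexIVFFlat(quantizer, dim, nlist, faiss.METRIC_INNER_PRODUCT)
    index.train(response_embeddings)  # learn the coarse clusters
    index.add(response_embeddings)    # add all candidate responses
    return index


def retrieve(index: faiss.Index, context_embedding: np.ndarray, top_k: int = 10):
    """Return (candidate ids, scores) for a single context embedding."""
    index.nprobe = 32  # clusters scanned per query: recall vs. latency trade-off
    scores, ids = index.search(
        context_embedding.reshape(1, -1).astype(np.float32), top_k
    )
    return ids[0], scores[0]
```

With an IVF-style index, recall and latency are traded off through nlist and nprobe instead of scanning all 10M+ candidates per query.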

Naver, Clova AI (3 months)

Research Internship (Jul 2018 - Sep 2018 · 3 mos)

Prototyped a goal-oriented dialog system for restaurant reservations
Implemented popular arXiv papers using PyTorch (internship assignment)

Atlas Labs (1 year, 1 month)

Machine Learning Engineer, Full-Time (Jun 2017 - Jun 2018 · 1 yr 1 mo)

Domain-specific neural machine translation for a shopping website
Named entity recognition and intent classification for an airline booking chatbot
Intent classification for a card company chatbot
Audio/transcription data collection & management

Education

Computer Science, Kookmin University (2017 - Present)

From 2018 to 2021, temporarily paused studies for 3 years due to military service
Currently studying and working in parallel
Expected graduation: June 2023

Korea Digital Media High School (2014 - 2017)

Hacking & Defence Department
Founder of Computer Science Club "Aperture" (Studying ML and IoT devices)

Publication

KLUE: Korean Language Understanding Evaluation (NeurIPS 2021 · Oct 8, 2021)

Contribution: Lead of machine reading comprehension task (KLUE-MRC) / https://openreview.net/forum?id=q-8h8-LZiUm
We introduce Korean Language Understanding Evaluation (KLUE) benchmark. KLUE is a collection of 8 Korean natural language understanding (NLU) tasks, including Topic Classification, Semantic Textual Similarity, Natural Language Inference, Named Entity Recognition, Relation Extraction, Dependency Parsing, Machine Reading Comprehension, and Dialogue State Tracking. We build all of the tasks from scratch from diverse source corpora while respecting copyrights, to ensure accessibility for anyone without any restrictions. With ethical considerations in mind, we carefully design annotation protocols. Along with the benchmark tasks and data, we provide suitable evaluation metrics and fine-tuning recipes for pretrained language models for each task. We furthermore release the pretrained language models (PLM), KLUE-BERT and KLUE-RoBERTa, to help reproducing baseline models on KLUE and thereby facilitate future research. We make a few interesting observations from the preliminary experiments using the proposed KLUE benchmark suite, already demonstrating the usefulness of this new benchmark suite. First, we find KLUE-RoBERTa-large outperforms other baselines, including multilingual PLMs and existing open-source Korean PLMs. Second, we see minimal degradation in performance even when we replace personally identifiable information from the pretraining corpus, suggesting that privacy and NLU capability are not at odds with each other. Lastly, we find that using BPE tokenization in combination with morpheme-level pre-tokenization is effective in tasks involving morpheme-level tagging, detection and generation. In addition to accelerating Korean NLP research, our comprehensive documentation on creating KLUE will facilitate creating similar resources for other languages in the future. KLUE is available at this https URL.
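As a rough illustration of how the released KLUE PLMs and fine-tuning recipes can be used, the sketch below fine-tunes a RoBERTa checkpoint on the KLUE topic classification task. The Hugging Face Hub identifiers ("klue/roberta-base", the "klue"/"ynat" dataset config) and the hyperparameters are assumptions for illustration, not details taken from the paper.

```python
# A minimal sketch, assuming the KLUE PLMs and data are available on the
# Hugging Face Hub under the identifiers below (an assumption, not a detail
# from the paper). Fine-tunes KLUE-RoBERTa on topic classification (YNAT).
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

MODEL_NAME = "klue/roberta-base"  # assumed Hub identifier for KLUE-RoBERTa

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=7)

# "ynat" is the topic classification subset of the KLUE benchmark.
dataset = load_dataset("klue", "ynat")


def tokenize(batch):
    # Headline-level classification: tokenize the news title only.
    return tokenizer(batch["title"], truncation=True, max_length=128)


encoded = dataset.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="klue-tc",
        per_device_train_batch_size=32,
        num_train_epochs=3,
    ),
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    tokenizer=tokenizer,  # enables dynamic padding via the default collator
)
trainer.train()
```

The same recipe carries over to the other KLUE tasks by swapping the dataset config, input fields, and task head.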

Presentations

Gave a presentation at PyCon 2019: "Building open-domain conversation model with 1B+ dataset"
Gave a presentation at Naver Tech Talk 2019, "Building open-domain conversation chatbot", together with the CEO.

Awards

The Final Winner of JUNCTION 2022 Hackathon

Issued by JUNCTION 2022, Nov 2022
JUNCTION 2022 is the biggest hackathon in Europe, with 1300+ participants and 200+ teams competing in the challenges.
Won 4 awards at the hackathon: Grand Winner, Google Cloud Side Challenge Winner, Mental Health Track Winner, and Reaktor Challenge Winner.

The Final Winner of JUNCTION ASIA 2022 Hackathon

Issued by JUNCTION ASIA 2022 · Aug 2022
JUNCTION ASIA 2022 is one of the biggest hackathons in APAC.

Finalist of 2017 Intel ISEF (International Science and Engineering Fair)

Issued by Society for Science & the Public · May 2017

Best Category (Computer Science) Award in KSEF (National Science and Engineering Fair)

Issued by KSS (Korea Science & Engineering Society) · Feb 2017

3rd place in Korea Olympiad in Informatics (National SW Project Competition)

Issued by NIA (National Information Society Agency) · Aug 2015

Finalist of 2014 Intel ISEF (International Science and Engineering Fair)

Issued by Society for Science & the Public · May 2014

Best Category (Computer Science) Award in 2014 ISEF-K (National Science and Engineering Fair)

Issued by Korea Foundation for the Advancement of Science and Creativity · Feb 2014

2nd place in Korea Olympiad in Informatics (National SW Project Competition)

Issued by NIA (National Information Society Agency) · Jul 2013