Our client is a leading manufacturing platform, headquartered in Tokyo and setting up a subsidiary in Ho Chi Minh City this year.
Leveraging technology, they are unleashing the true potential of the manufacturing industry through digital transformation.
With a mission to enable manufacturers worldwide to focus on their areas of expertise and perform at their full potential, they have grown their corporate value ninefold.
They are looking for driven professionals to join their fast-paced, flexible, flat-leadership environment, offering excellent learning and career-growth opportunities while creating a positive impact in the industry.
You will be responsible for:
Design, architect, and implement both batch and streaming data processing infrastructure
Build and maintain training pipelines for specific algorithms
Analyze data required for algorithm design
Support implementation of algorithms
Design and implement data infrastructure for:
Manufacturing cost estimation data
Transaction performance data
Manufacturing control process data
Usage data for various products
Analyze base data to form hypotheses for business and operational improvement
Build data processing pipelines in cooperation with algorithm designers
Understand the firm's mission to unleash the potential of manufacturing
You possess a degree in Computer Science, Applied Mathematics, Engineering or related field.
You have at least 3 years' experience in a similar role and solid experience using SQL for analysis.
General knowledge of Linux-based infrastructure and public cloud services is required.
You are familiar with container technologies such as Docker and Kubernetes, and have experience using version control systems such as Git.
You understand computer networking and preferably have hands-on experience with declarative infrastructure-as-code tools (Terraform, Puppet, Ansible, etc.).
Experience with cloud providers such as AWS, GCP, or Azure is required.
You also have experience with data pipeline technologies such as Airflow, Apache Beam, Spark, etc.
Experience monitoring data pipelines with modern solutions such as Datadog, New Relic, Grafana, etc. and knowledge of production use cases of Elasticsearch is a plus.
Experience in application development or analysis work with Rust, Python, MATLAB, or R would be advantageous.
Proficiency in English is required to collaborate and work with the team in Japan.
What's on Offer?
Join a leading SaaS platform in the manufacturing sector
Benefit from an intensive training program