Data engineers are mainly tasked with transforming data into a format that can be easily analyzed. They do this by developing, maintaining, and testing infrastructure for data generation. Data engineers work closely with data analysts and are largely in charge of the technical solutions that enable more efficient and timely data analysis.
Data engineers are often responsible for building approaches, pipelines, and algorithms that make raw data easier to access, but to do this they need to understand the company's or client's objectives.
Data engineers also need to understand how to optimize data retrieval and how to develop dashboards, reports and other visualizations for stakeholders.
Reporting Line: Team lead
Responsibilities:
Develop, construct, test, and maintain data architectures.
Align architecture with business requirements.
Data collection: collecting, measuring, and analyzing data.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and ‘big data’ technologies.
Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
Conduct research for industry and business questions.
Deploy sophisticated analytics programs, machine learning, and statistical methods.
Prepare data for predictive and prescriptive modeling.
Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
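As a small illustration of the extract-transform-load responsibility listed above, here is a minimal sketch in Python. It uses the standard-library sqlite3 module as a stand-in data source; the `raw_events`/`purchases` schema and the `run_etl` helper are hypothetical examples, not part of the actual role or any specific stack.

```python
# Minimal ETL sketch (hypothetical schema): extract raw rows, clean them,
# and load them into an analytics-ready table.
import sqlite3

def run_etl(conn):
    cur = conn.cursor()
    # Extract: pull raw event rows (hypothetical raw_events table).
    rows = cur.execute("SELECT user_email, amount FROM raw_events").fetchall()
    # Transform: normalize email casing and drop non-positive amounts.
    cleaned = [(email.strip().lower(), amount) for email, amount in rows if amount > 0]
    # Load: write the cleaned rows into a reporting table.
    cur.execute("CREATE TABLE IF NOT EXISTS purchases (user_email TEXT, amount REAL)")
    cur.executemany("INSERT INTO purchases VALUES (?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_events (user_email TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO raw_events VALUES (?, ?)",
        [(" Alice@Example.COM ", 10.0), ("bob@example.com", -5.0)],
    )
    print(run_etl(conn))  # prints 1: one row survives cleaning
```

In practice the same extract/transform/load split scales up to the SQL and "big data" tooling named above; only the source, sink, and transformation logic change.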
Requirements:
Know SQL, the ER model, normalization, and ACID transactions.
Experience with, and a deep understanding of, relational SQL and NoSQL databases: document, wide-column, graph, and key-value stores.
Document: MongoDB, Elasticsearch, etc.
Wide column: HBase, Cassandra, Google Bigtable, etc.
Key-value: Redis, Memcached, Amazon DynamoDB, etc.
Data warehouse: experience with data-modeling concepts such as the snowflake schema, and with technologies like GoldenGate, Google BigQuery, Dremio, Presto, etc.
Know modern data processing frameworks such as Arrow, Spark, Flink, Kafka, etc., beyond traditional data processing and workflow scheduling tools: NiFi, Airflow, ODI, SSIS, etc.
Know common monitoring tools: Prometheus, Grafana, Splunk, etc.
Nice to have:
Infrastructure as code: containers, infrastructure provisioning.
Machine learning: terminology, TensorFlow, PyTorch, etc.
Machine learning ops (MLOps): TensorFlow Extended, Kubeflow, Seldon, MLflow, Google AI Platform.
Python and/or Java programming
Other information
13th salary, rewards for achievements, initiatives and good deeds
Annual leave: 15–20 working days/year, plus other leave and public holidays
Providing customized training courses according to business needs and upon your request
Work location
- TOWER 2, Minh Khai Street, Vinhomes Times City, Vĩnh Tuy, Hai Bà Trưng, Hà Nội, Việt Nam
Note: All posted information is the property of ONE MOUNT GROUP. We only aim to bring you this information as quickly and accurately as possible; if you find any inaccurate content, please notify us.