- Job Responsibilities
Artefact is a publicly listed French data consulting company dedicated to helping clients unlock the value of their data.
What you will be doing: Key responsibilities
As a Data Engineer, your role will encompass:
● Conducting ambitious projects in the transformation of clients through data
● Collaborating with the other Divisions (Activation, Creativity, and Strategy) to provide comprehensive services to your clients
● Developing trusted relationships with our clients, using your technical abilities to assist in the transformation of their marketing department
Among your responsibilities as a Data Engineer, you will be responsible for:
Performing data projects
● Securing delivery on your projects
● Communicating the success of your projects across the company
● Working closely with your Consulting counterpart to build and maintain strong relationships with your clients and best understand their needs
● Ensuring that your solutions bring value to the client's problems
● Being a good team player, understanding your role and responsibilities within the team's broader ambition
● Caring for the happiness of the team, ensuring work is delivered to a high standard and providing feedback and mentoring
Being a great tech person
● Demonstrating the skill and credibility required to ensure the success of our clients’ initiatives
● Researching and developing new technical approaches to address problems efficiently
● Sharing best practices and contributing to Artefact’s institutional knowledge
● Embodying Artefact’s values and inspiring others to do the same
● Experience working with Big Data technologies such as Hadoop, Spark, Kafka...
● Delivering Data Lake / Big Data projects, including data ingestion, machine learning model application, and code deployment
● Being able to adapt your solutions and approaches to the technical environment (security, access, tools)
- Job Requirements
Qualifications: Education and Experience
● Bachelor’s degree in computer science, computer engineering, or related fields
● 1+ years of hands-on experience developing and applying data-driven solutions
● Strong experience and knowledge of computer science, data processing and data architecture
● Intellectual curiosity and excellent problem-solving skills, including the ability to structure and prioritise an approach for maximum impact
● Experience in designing Data Integration / ETL
● Experience with Python, Linux, SQL, Cloud infrastructure and Docker
● Experience with Spark, BigQuery/Redshift, DevOps, and JavaScript is a plus