Data Engineer - Mox

Standard Chartered

Job Description


About Standard Chartered
We are a leading international bank focused on helping people and companies prosper across Asia, Africa and the Middle East.
To us, good performance is about much more than turning a profit. It's about showing how you embody our valued behaviours - do the right thing, better together and never settle - as well as our brand promise, Here for good.
We're committed to promoting equality in the workplace and creating an inclusive and flexible culture - one where everyone can realise their full potential and make a positive contribution to our organisation. This in turn helps us to provide better support to our broad client base.
As one of the biggest banks in the market, we are rapidly expanding by growing a new virtual banking business in Hong Kong. We see ourselves as a fast-growing start-up where you will enjoy autonomy and teamwork at the same time, solving new and exciting problems in a nimble and agile way. Join us and be part of making history for the future of banking!
The Role Responsibilities
As a Data Engineer, you'll work with us to design, maintain, and improve the analytical and operational services and infrastructure that are critical to almost every other function in the organisation. These include the data lake, databases, data pipelines, large-scale batch and real-time data processing systems, and a metadata and lineage repository, all of which work in concert to provide the company with accurate, timely, and actionable metrics and insights to grow and improve our business using data. You may collaborate with our data science team to design and implement processes for structuring our data schemas and designing data models, work with our product teams to sensibly integrate new data sources, or pair with other data engineers to bring cutting-edge data technologies to fruition.
Our Ideal Candidate
We expect candidates to have in-depth experience in a subset of the following skills and technologies and an interest in filling any gaps in their knowledge on the job. More importantly, we seek people who are highly logical, who balance respect for best practices with their own critical thinking, adapt to new situations, can work independently and deliver projects end to end, communicate fluently in English, collaborate well with teammates and stakeholders alike, and are eager to be part of a high-performing team, taking their careers to the next level with us.

  • General computing concepts and expertise: Unix environments, networking, distributed and cloud computing

  • Python frameworks and tools: pip, pytest, boto3, pyspark, pylint, pandas, scikit-learn, keras

  • JVM languages, frameworks and tools: Kotlin, Java, Scala / Maven, Spring, Lombok, Spark, JDK Mission Control

  • Agile/Lean project methodologies and rituals: Scrum, Kanban

  • RDBMS and NoSQL databases: MySQL, PostgreSQL / DynamoDB, Redis, HBase

  • Columnar and big data databases: Athena, Redshift, Vertica, Snowflake, Hive/Hadoop, Spark

  • Version control: git commands, branching strategies, collaboration etiquette, documentation best practices

  • General AWS services: Glue, EMR, EC2, ELB, EFS, S3, Lambda, API Gateway, IAM, CloudWatch, DMS

  • Container management and orchestration: Docker, Docker Swarm, ECS, EKS/Kubernetes, Mesos

  • CI / CD tools: CircleCI, Jenkins, TravisCI, Spinnaker, AWS CodePipeline

  • Distributed messaging and event streaming systems: Kafka, Pulsar, RabbitMQ, Google Pub/Sub

  • Workflow scheduling and monitoring tools: Apache Airflow, Luigi, AWS Batch

  • Enterprise BI tools: Tableau, Qlik, Looker, Superset, Power BI, QuickSight

  • Batch and streaming data processing frameworks: Spark, Spark Streaming, Apache Beam, Apache Flink

  • Data science environments: AWS SageMaker, Project Jupyter, Databricks

  • Log ingestion and monitoring: ELK stack (Elasticsearch, Logstash, Kibana), Datadog, Prometheus, Grafana

  • Metadata catalogue and lineage systems: Amundsen, Databook, Apache Atlas, Alation, uMetric

  • Data privacy and security tools and concepts: tokenization, hashing and encryption algorithms, Apache Ranger


Apply now to join the Bank for those with big career ambitions. Visit Mox.com for more information!
To view information on our benefits, including flexible working, please visit our website. We welcome conversations on flexible working.

Job Details

Function: IT

Job Source: scb.taleo.net
