Our client, a leading financial services company, is hiring a Data Engineer/Developer on a long-term contract basis.
Job ID 79240
Work Location:
New Castle, DE
Summary:
The Big Data platform supports operational real-time, event-based processing and compute applications, as well as analytics consumption use cases including machine and deep learning. This is a hands-on development role that offers exposure to the full development cycle while working closely with our business and technology stakeholders.
Responsibilities:
- Real-time ingestion/stream processing and data distribution via Big Data APIs.
- Build out canonical models and data conformance.
- Implement best-in-class data management and data ingestion.
- Leverage new storage engines such as Kudu that enable analytics on fast-changing data.
- Leverage GPU implementations to enable advanced machine learning.
- Enhance self-service capabilities for data science and ML practitioners.
- Analysis and development across lines of business, including Payments, Digital Channels, Liquidities, and Trade.
- Cross-train to share functional and technical knowledge across teams.
- Align to Engineering Excellence Development principles and standards.
- Promote and increase our Development Productivity scores for coding.
- Fully adhere to and evangelize a complete Continuous Integration and Continuous Deployment (CI/CD) pipeline.
Required Skills:
- 10+ years of experience in Hadoop/big data technologies.
- Experience with Spark/Storm/Kafka or equivalent streaming/batch processing and event-based messaging.
- Relational and NoSQL database integration and data distribution principles experience.
- Hands-on experience with the Hadoop ecosystem (HDFS, MapReduce, Hive, Pig, Impala, Spark, Kafka, Kudu, Solr).
- Experience with API development and use of JSON/XML/Hypermedia data formats.
- Strong development/automation skills.
- Experience with all aspects of DevOps (source control, continuous integration, deployments, etc.).
- 5+ years of hands-on experience as a Scala developer (with previous Java background).
- Experience with core banking functionality for generating various hand-offs is preferred.
- Experience with containerization and related technologies (e.g. Docker, Kubernetes) is preferred.
- Comprehensive knowledge of the principles of software engineering and data analytics is preferred.
- Knowledge of Agile (Scrum) development methodology is a plus.
- Experience with Cloudera, Hortonworks, or AWS EMR/S3 is a plus.
- Strong communication skills.
- Self-motivated.
- Willingness to learn.
- Excellent planning and organizational skills.
Education:
Strong academic record, ideally with a Bachelor's degree in an engineering, mathematical, or scientific discipline.