
Data Engineer

Location: Sofia grad, Sofia, Bulgaria
Company: Adecco

Position: Specialist

Date: 2020-05-05

Category: Banking and Insurance




Data Engineer


Adecco is a Fortune Global 500 company and the global leader in HR services. The Adecco Group connects over 500,000 external colleagues with clients each day through its network of more than 6,700 branches, with 36,000 full-time employees in over 60 countries and territories around the world.
On behalf of our client, a well-known international company in financial services, we are looking to hire a Data Engineer.
Position Overview:
The selected candidate is responsible for the data architecture and data management activities within the ITS Data & Analytics Work Program. He/she should have a strong foundation and hands-on experience in Data & Analytics data management design and development, in debugging existing Business Intelligence (BI) applications, and in performance tuning.
•       Prepare the data as part of an ETL or ELT process, and perform transform-load design and development
•       Integrate new data management technologies and software engineering tools into existing structures
•       Work with application DBA and modelers to construct data stores
The selected candidate will work closely with the team to provide technical leadership and to facilitate development projects involving the computing environment, which may include coordinating software upgrades and installing new products. He/she will design and validate data solutions that are practical, flexible, scalable, reusable and strategic. These efforts enhance data quality, enrich access and provide business decision makers with information upon which they can make more accurate and effective decisions across multiple domains.
•       Ensure data is ready for use by consuming applications, analysts and scientists, using frameworks and microservices to serve data
•       Collaborate with data architects, modelers and IT team members on project goals

Requirements:
The candidate should have the following core skill sets:
•       At least 7 years of hands-on experience with complex ETL mappings, building data pipelines to collect data and move it into storage, and developing workflows in a cloud-based ETL tool such as IICS, Azure Data Factory or SSIS
•       Ability to build and interact with large data processing pipelines in distributed data stores and distributed file systems
•       Strong programming and algorithmic skills, with the ability to stitch data together with scripting languages
•       Ability to develop data set processes for data modeling, mining and production
•       Experience with managing and maintaining quality data through operations such as cleaning, transformation and integrity checks, in both on-premises and cloud-based relational database environments
•       Familiarity with various process modeling techniques (activity hierarchy diagrams, data flow diagrams, sequence diagrams, workflow diagrams, system interface diagrams, etc.)
•       Technical Degree in Computer Science/Engineering or related field

Required technologies:
•       Tools: Informatica Intelligent Cloud Services (IICS), Informatica PowerCenter 10.x, Azure Data Factory
•       Database: Oracle 12c/18c/19c, Azure SQL DW, SQL Server 2016/2014/2012/2008
•       Cloud Technologies: Microsoft Azure, Unix, Linux, Windows
•       Design Language: Visio, Erwin 9.x and above

If you recognize yourself in the description above, send us your CV in English today!
Only short-listed candidates will be contacted. All applications will be treated in strict confidentiality.
Recruitment license № 1814 from the National Agency of Employment, issued on 08.12.2014.