
Contract Data Engineer (No C2C) in Hartford, CT at SNI Technology

Date Posted: 4/11/2018


Job Description

6-Month Contract - No Corp-to-Corp


We have a client in Hartford looking for a strong Data Engineer to work with them on a project for the next six months.

Role:
The Analytic Data Engineer architects and engineers analytic data solutions for the organization (Advanced Analytics & Reporting) and, working closely with the IT teams, assists with the design, build, and upkeep of these solutions. This includes creating pathways for analysts to access operational, derived, and external data sets. The incumbent is responsible for operating the data platforms that support analytic data discovery.

Qualifications and Skills

  • A strong understanding of enterprise data design and architecture with significant experience using SQL on traditional databases (e.g. Oracle, MS SQL Server, PostgreSQL)
  • Demonstrated ability to build processes that support data transformation, data structures, metadata, dependency management, and workload management, typically paired with experience using an ETL (Extract, Transform, Load) tool
  • Experience working with large data sets
  • Knowledge of one or more enterprise languages (C/C++/C#/Java) and scripting languages (Python, Ruby, Perl)
  • Experience with data visualization tools (e.g. Tableau, Power BI)
  • Expertise working with one or more mortgage analytics providers (1010Data, Trepp, CoStar, Xceligent, Intex, YieldBook, Barclays POINT, etc.)
  • Strong knowledge of data modeling, data access, and data storage techniques
  • Experience building high-level data pipelines
  • Experience with Agile development
  • High level of expertise with cloud platforms, ideally Microsoft Azure
  • Experience with newer data warehouses (e.g. Amazon Redshift, Azure Data Lake) a plus
  • Experience working with high-volume, heterogeneous data using distributed systems such as Hadoop, Bigtable, and Cassandra a plus
  • Familiarity architecting, building, and operating large-scale batch and real-time data pipelines with data processing frameworks like Scalding, Scio, Storm, Spark and Dataflow a plus
  • Bachelor's degree in Computer Science or a similar field