Senior Data Engineer

New York, United States · Engineering

Description

At Urbint, our mission is to make communities more resilient. We do this by pairing external data with artificial intelligence to identify areas of high risk and prevent catastrophic loss for utilities and infrastructure operators across the country. We are a team of close-knit engineers, entrepreneurs, and data geeks who obsess over problem-solving, new technologies, and making a positive impact in our communities.
At Urbint, you will collaborate within a cross-functional team of software engineers, product designers, machine learning engineers, and product managers to architect, build, and maintain custom data pipelines, data lakes, and a data warehouse that feed our customer-facing applications and our internal experimentation and modeling teams. Our data sources range from large, high-security enterprise databases and storage services hosted by our customers, to disparate open data from government organizations, to proprietary third-party aggregators, and auditability and security are critical to the results. You will solve direct customer needs while finding common abstractions that guide the future of information at a data-driven organization.

Requirements

  • 8+ years of software development experience focused on web technologies, including significant production work with Python
  • 3+ years of experience designing, building, and maintaining enterprise data pipelines and/or warehouses
  • High level knowledge of machine learning algorithms and how ML models are built and deployed
  • Demonstrable knowledge of big data databases such as columnar data stores (e.g., Cassandra or Bigtable) or Hadoop, as well as SQL databases (MySQL, MSSQL, or PostgreSQL), and the ability to select the right tool for the job
  • Experience with queued work management and message processing (e.g., Kafka, RabbitMQ)
  • Experience working closely with product and account support personnel to prioritize the best solutions to the largest problems
  • Reliable organization and communication skills, with follow-through on verbal and written commitments
  • Persistent approach to problem-solving and the ability to see solutions through to completion even in the face of complexity or unknowns; a proactive mindset that drives you to pursue solutions rather than waiting for answers to come to you
  • Attention to detail and the ability to identify ambiguities in specifications
  • Exceptional written and verbal communication skills, especially when explaining the trade-offs between technical decisions to non-technical colleagues
  • Flexibility to maintain focus in an evolving environment, and the ability to let go of previous projects and move on to new ones, or to dig deeper into existing projects and grow them, depending on business needs

Benefits

What We Offer:

  • Mission Driven - Some companies use AI to serve better digital ads and trade stocks; we seek to make our communities more resilient.
  • Top Compensation - Competitive compensation package.
  • Best in Class Medical Coverage - 100% benefits and premiums paid.
  • Prime NoHo Location - Our office sits in the heart of NYC’s historic NoHo district and is just minutes away from the BDFM, 6, and RW subway lines.
  • Health Perks - Gym reimbursement and Citibike membership.
  • Strong Culture - Collaborative office focused on teamwork, humility, and hustle.
  • Catered lunch on Thursdays, plus a kitchen filled with snacks and drinks.

We're an equal opportunity employer. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status.
