Send Us a Resume.

We'd love to hear from you.

Agentis is a Chicago-based data analytics and
SaaS platform provider that empowers energy
providers and their business customers to
optimize energy consumption. Our SaaS customer
engagement platform has been deployed to over
2 million businesses nationwide, and that number
is growing. We combine our strengths in data
visualization, data science, and software
development to create compelling, powerful
customer experiences. At the intersection of
clean tech and information technology, Agentis
has an exciting and compelling mission with a
passionate, talented team leading the charge.

Interested in making a positive impact and creating great technology?
We're looking for a Data Engineer to join our growing core team.
As a Data Engineer, you will play an important role in developing a highly scalable SaaS application. You must have solid communication skills, be detail-oriented,
and demonstrate leadership and an exemplary work ethic.

Responsibilities

Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of sources using SQL and big data technologies on AWS

Build analytics tools that use the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics

Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs

Keep our data separated and secure across national boundaries through multiple data centers and AWS regions

Create data tools that help our analytics and data science team members build and optimize our product into an innovative industry leader

Work with data and analytics experts to strive for greater functionality in our data systems

Required Skills

We are looking for a candidate with 3+ years of
experience in a Data Engineer role who has
attained a Bachelor's and/or graduate degree in
Computer Science, Statistics, Informatics,
Information Systems, or another quantitative
field. They should also have experience with
some of the following software/tools: big data
tools such as Hadoop, Spark, and Kafka;
relational SQL and NoSQL databases such as
MySQL, Postgres, Cassandra, and HDFS; data
pipeline and workflow management tools such as
Azkaban, Luigi, and Airflow; AWS cloud services;
and object-oriented scripting and/or programming
languages such as PHP, Python, Java, C++, and
Scala.

Advanced working knowledge of SQL, experience building and optimizing relational databases, and familiarity with a variety of other database technologies.

Experience building and optimizing data pipelines, architectures, and data sets.

Experience performing root-cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.