Sr. Data Engineer
Ping Identity | Information Systems (8120) | Denver, CO
At Ping Identity, we're changing the way people think about enterprise security technology. With our innovative Identity Defined Security platform, we're helping to build a borderless world where people have total freedom to work wherever and however they want. Without friction. Without fear.
We're headquartered in Denver, Colorado, and we have offices and employees around the globe. And we serve the largest, most demanding enterprises worldwide, including over half of the Fortune 100. Because even in the most complex enterprise environments, security shouldn't be a source of anxiety. It should be one of your greatest competitive advantages.
We call this digital freedom. And it's not just something we provide our customers. It's something that drives our company. People don't come here to join a culture that's built on digital freedom. They come to cultivate it.
The Senior Data Engineer will have the opportunity to play a critical role in the early stages of developing Ping’s Business Intelligence and Data Analytics capabilities. This individual will collaborate with other teams across the organization (e.g., Product Management, Engineering, Sales, and Finance) to gain a quick understanding of Ping products, business process areas, and/or technologies, and build the analytical framework to enable the business to better understand and leverage complex data sets. The Senior Data Engineer will be expected to architect and build core datasets, implement efficient ETL processes, and own the design, development and maintenance of critical metrics, reports, analyses, dashboards, etc. to drive key business decisions. In addition, this individual must be able to adapt and thrive in a fast-paced and changing business and technical environment.
- Architect, build, and support the operation of enterprise data and analytical infrastructure and tools
- Design robust, reusable, and scalable data driven solutions and data pipeline frameworks to automate the ingestion, processing and delivery of both structured and unstructured batch and real-time streaming data
- Build data APIs and data delivery services to support critical operational processes, analytical models and machine learning applications
- Assist in the selection and integration of data related tools, frameworks, and applications required to expand platform capabilities
- Understand and implement best practices in the management of enterprise data, including master data, reference data, metadata, data quality and lineage
- Develop and prepare strategies for Business Intelligence processes for the organization
- Manage and customize ETL processes to meet customer requirements, and analyze those processes for improvement
- Assess all reporting requirements and contribute to the development of a long-term strategy for reporting solutions
- Coordinate with data producers and ensure enterprise data models comply with data standards
- Bachelor’s degree in computer science, math, engineering, or relevant technical field
- 4+ years of combined experience applying data engineering, data analytics, data warehousing, business intelligence, database administration, and data integration concepts and methodologies
- 3+ years of experience architecting, building, and administering big data and real-time streaming analytics architectures in on-premises and cloud environments
- 3+ years of experience with execution of DevOps methodologies and continuous integration/continuous delivery
- Object-oriented/functional scripting languages: Python, R, C/C++, Java, Scala, etc.
- SQL, relational databases and NoSQL databases
- Data integration tools (e.g., Boomi, Pentaho, Talend, SnapLogic, Informatica) and data warehousing / data lake tools
- API based data acquisition and management
- Relational databases such as MSSQL, PostgreSQL, MySQL, etc.
- Distributed/in-memory databases such as MemSQL, CrateDB, etc.
- Business intelligence tools such as Tableau, PowerBI, Zoomdata, Pentaho, etc.
- Data modeling tools such as ERWin, Enterprise Architect, Visio, etc.
- Familiarity with cloud-based data engineering (AWS, GCP, or Azure)
- Familiarity with data science techniques and frameworks
- Creative thinker with strong analytical skills
- Ability to work in a team environment
- Strong technical communication skills
- Ability to prioritize work to meet tight deadlines
- Ability to keep pace with the latest technology advances and quickly grasp new technologies to support the environment and contribute to project deliverables
- Experience with advanced analytics and machine learning concepts and technology implementations
- Experience in a fast-paced, ever-changing and growing environment