Lone Wolf Technologies | Development | Dallas, TX
Lone Wolf is looking for a leader to drive processes, make data-driven decisions for our team, and pioneer a revolutionary update between our Product and R&D organizations. The ideal candidate will have experience building backend data pipelines and be comfortable working on remote VMs and using containerization technology. They will be expected to build scalable data ingress and egress pipelines across products and clouds, deploy new data transformation pipelines, and diagnose, troubleshoot, and improve existing data architecture. If you are ready to drive pivotal changes, measurable improvements, and continued growth in an industry-leading organization, this is a great opportunity for you.
Create and deploy ETL pipelines on our cloud architecture
Troubleshoot and fix bugs, and diagnose data issues
Work independently to solve issues
Write unit and integration tests
Provide training to junior team members in your areas of expertise
- EDUCATION: BA/BS degree in a related field preferred, but not required
- EXPERIENCE: 6-10 years in Data Engineering or a directly related role
- CERTIFICATIONS: AWS Certifications preferred, but not required
Experience writing ETL pipelines on cloud infrastructure, especially in Python
Solid understanding of SQL and RDBMS
Comfortable operating Unix/Linux from the command line
Solid understanding of distributed data processing (Hadoop, MapReduce, etc.)
Experience evaluating and deploying ML models
Google Cloud or AWS experience
Familiarity with Docker and Kubernetes
While this role is posted in Dallas, we are open to considering candidates in all parts of the United States and Canada.