Big Data and Analytics Cloud Engineer

Location: Dallas, TX

Our client is a $30B+ global company that holds the #2 market position in its industry and is committed to a major transformation: becoming a nimble, open-source-driven, cloud-enabled organization while centralizing and growing its cybersecurity efforts. Our client is currently seeking a senior-level Big Data Engineer to take the lead role on the team, overseeing and developing Big Data capabilities in the cloud and aligning them with the client's business strategies and requirements.

  • Architect solutions for massive scale, resiliency, and maintainability across various cloud providers, meeting technical, security, and business needs for applications and workloads.
  • Champion good engineering practices and help teams define and set up frameworks for “Big Data as a Service”.
  • Contribute to technology strategy and engineering roadmaps around “Big Data In Cloud” platforms and execute strategic engineering proof of concepts.
  • Develop monitoring strategies for infrastructure, platforms, and applications, aligning with enterprise strategy and overall industry trends.
  • Champion the appropriate use of open source and commercial technology based upon industry trends and innovative concepts.

Requirements


  • Degree in Computer Science, MIS, or related area, or equivalent work experience.
  • 5+ years of relevant senior level experience in infrastructure, analytics or solution design / architecture.
  • Demonstrable knowledge of Amazon Web Services or similar cloud computing platform.
  • Good understanding of Linux – preferably RHEL.
  • Technical leadership and solution design.
  • Hands-on style – willingness and competence in producing necessary changes in our infrastructure and processes.
  • Able to work effectively across organizational and geographical boundaries.
  • Ability to clearly communicate ideas and solutions.
  • Demonstrable ability to learn new technologies quickly.

Desirable Traits

  • Big Data Technologies such as Hadoop, EMR, Spark, Impala, Kafka, etc.
  • Data Warehousing, SQL, Relational Databases.
  • NoSQL Databases.
  • Integration, Dataflow management, ETL.
  • Storage – NAS, SAN, JBOD, Object Storage.
  • Automation, configuration management (e.g. Ansible, Puppet), DevOps practices, CI/CD pipelines (e.g. Jenkins).
  • Elementary networking skills, switching, routing, firewalls, load balancing.
  • Development skills – Java, Scala, Python, Perl, Shell Script.
  • Linux Containers / Docker.
  • Workflow scheduling and management.
  • DR and business continuity planning.

Contact Information:

For immediate consideration please email a resume to