Big Data Implementation Engineer

Dremio

Remote / United States of America
  • Job Type: Full-Time
  • Function: IT
  • Post Date: 02/15/2021
  • Website: dremio.com
  • Company Address: 883 N Shoreline Boulevard C100, Mountain View, CA, 94043

About Dremio

Dremio is the Data Lake Engine. Created by veterans of open source and big data technologies, and the creators of Apache Arrow, Dremio is a fundamentally new approach to data analytics that helps companies get more value from their data, faster. Dremio makes data engineering teams more productive, and data consumers more self-sufficient. For more information, visit www.dremio.com.

Job Description

Dremio is the Data Lake Engine company. Our mission is to reshape the world of analytics to deliver on the promise of data with a fundamentally new architecture, purpose-built for the exploding trend toward cloud data lake storage such as AWS S3 and Microsoft ADLS. We dramatically reduce, and even eliminate, the need for the complex and expensive workarounds that have been in use for decades, such as data warehouses (whether on-premises or cloud-native), structured data prep, ETL, cubes, and extracts. We do this by enabling lightning-fast queries directly against data lake storage, combined with full self-service for data users and full governance and control for IT. The results for enterprises are extremely compelling: 100X faster time to insight, 10X greater efficiency, zero data copies, and game-changing simplicity. Equally compelling is the market opportunity for Dremio, as we are well on our way to disrupting a $25BN+ market.
 
If you, like us, say “bring it on” to exciting challenges that really do change the world (no BS), we have endless opportunities where you can make your mark.
 
 
About the Role
 
As a Dremio Big Data Implementation Engineer, your mission is to do what it takes to help our customers rapidly obtain value from the Dremio platform after the sale. This can include hands-on work to install, size, and tune the software; working with customers' IT and data engineering teams to connect Dremio to their data sources; creating custom integrations and product extensions when needed; and providing training to customers' IT staff, data engineers, data scientists, and end users on how best to use Dremio for their unique workflows.
 
 
Essential Job Functions
  • Application Architecture: tailor reference architectures to provide performant, resilient, scalable, secure, distributed architectures that can exceed SLAs and are easily maintained.
  • Data Modeling: design and test data models which are pragmatic, performant, and scalable.
  • Application Implementation: show the customer how to write robust, scalable data access code that correctly utilizes drivers.
  • Infrastructure: help the customer select on-premises hardware or cloud instance types based on workload and capacity assessments.
  • Production Operations: help the customer deploy, configure, secure, maintain, monitor, scale, troubleshoot, test, and tune enterprise clusters.
  • Instructive Consulting: in response to customer questions, provide ad-hoc instruction shaped by the customer's specific environment, requirements, and constraints.
  • Identifying Needs: bring in the sales team when the customer will benefit from a services or license purchase or expansion.
  • A Team Effort: collaborate with other roles in Dremio Professional Services, Technical Support, Pre-Sales, Sales, etc. to ensure our customers are successful.

Skills & Qualifications

  • In-depth, current, hands-on knowledge of SQL, particularly advanced SQL techniques used in analytic contexts, is a must.
  • Data architecture, data modeling, and warehousing concepts; deep understanding of data architectures that support reporting and analytics use cases
  • Experience with ETL concepts, techniques, pipelines, tools, and challenges
  • Performance techniques for accelerating reporting and dashboarding use cases (OLAP, materialized views, etc.)
  • Hands-on experience with the Hadoop ecosystem (YARN, ZooKeeper)
  • Hive / Impala / Spark, etc.
  • AWS / Azure / GCP
  • Cloud performance considerations
  • Kubernetes / AKS / EKS
  • Excellent written and verbal communication skills
  • Ability to present on technical topics to various levels within the enterprise

Disclaimer: Local candidates only. This company does not accept candidates from outside recruiting firms; agency contacts are not welcome.