We are looking for a Data Quality Engineer to develop the data quality layer of our large-scale data infrastructure.
The Data Quality Engineer creates manual and automated tests for monitoring and validating data in the warehouse. The database we use is ClickHouse (https://clickhouse.com/), which supports standard SQL. Creating data tests involves investigating blockchain data structures, invariants, consensus rules and protocols. Automated scripts that fix the data in automatic or semi-automatic mode help keep data quality high as a continuous process.
Success is measured by test coverage over the data and by the number of client complaints about data quality. These metrics are achieved by automating tests, scripts and procedures, because a big data warehouse cannot be operated efficiently by manual work alone.
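For illustration only, a data quality check of this kind might be expressed as a ClickHouse SQL query such as the one below; the `blocks` table and its columns are hypothetical placeholders, not Bitquery's actual schema:

```sql
-- Hypothetical example: detect missing blocks per chain.
-- Table and column names are illustrative placeholders, not Bitquery's schema.
SELECT
    chain,
    (max(number) - min(number) + 1) - count() AS missing_blocks
FROM blocks
GROUP BY chain
HAVING missing_blocks > 0;
```

Checks like this can be scheduled and their results tracked, so data issues surface continuously rather than through manual spot checks.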
Role & Responsibilities:
- Design and implement the data quality layer in a large-scale data infrastructure;
- Execute a data quality testing framework to validate data at various stages of the processing lifecycle;
- Work hands-on on test preparation and execution in an Agile and DevOps environment;
- Become familiar with Bitquery’s blockchain products, data sets, and processing pipelines;
- Coordinate with subject matter experts to develop, maintain, and validate test scenarios;
- Meet with internal stakeholders to review current testing approaches and provide feedback on ways to improve, extend and automate them;
- Prepare, review and update test cases and relevant test data consistent with the system requirements, including functional, integration and regression testing
- Analyze, debug, and document quality issues
- Record and report test status at the respective stages
- Be proactive and follow a shift-left testing approach, identifying issues early and following up on them efficiently
- Support quality assurance initiatives and help institutionalize best practices
- Make the most of the opportunity to excel in an open and appreciative work culture; be a problem solver and a team player who contributes to shared achievements
- Be open to learning from teammates and from day-to-day experience
Requirements
- Minimum 5 years of hands-on experience with databases
- Experience working with big data products
- Good knowledge of SQL, data warehousing, data analytics, APIs, etc.
Expectations
- We expect this person to create SQL and API tests, automate their execution and track data issues (see the sketch after this list)
- Success metrics are the number and severity of issues identified before customers notice them
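As a sketch of what such an automated test might look like, a scheduled freshness check in ClickHouse SQL could be written as follows (the `blocks` table and its columns are again hypothetical placeholders, not Bitquery's actual schema):

```sql
-- Hypothetical freshness check: flag chains whose newest block in the
-- warehouse lags real time by more than 10 minutes.
-- Table and column names are illustrative placeholders.
SELECT
    chain,
    max(block_time)                            AS latest_block_time,
    dateDiff('second', max(block_time), now()) AS lag_seconds
FROM blocks
GROUP BY chain
HAVING lag_seconds > 600;
```

Any rows returned would be recorded as data issues and followed up, either automatically or by the engineer.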
Big plus will be experience with:
- QA Test Automation
- Blockchain technologies
- Blockchain on-chain datasets
- Different automation frameworks
- Data quality, tests and automation
- ETL processes and pipelines
- Real-time systems and data pipelines
Professional Approach
- Passion for investigating data, exploring information and learning blockchain protocols;
- Willingness to work flexible hours when required;
- Good communication skills (written, verbal, listening, and articulation)
- Ability to work as an individual contributor as well as in a team environment, and a track record of meeting deadlines under pressure
Benefits
- 100% Remote Policy (Work from anywhere in the world)
- We don't track leave (responsibility-driven culture)
- Opportunity to work & collaborate with a truly global team spread across 5 countries
- Choose your own work hours
- Yearly trip with the Bitquery team to a remote destination
- A promise to finish the interview processes within 1-2 weeks
Being a startup, we make decisions and move fairly fast, while giving candidates a great experience during the interview process. We have a flat hierarchy where we empower individuals and give them the opportunity to deliver results in their own working style. Come join a great culture, with a chance to build Bitquery with us.