Software Engineering - Quality Engineering

Own Company

Software Engineering, Quality Assurance

Bengaluru, Karnataka, India

Posted on Apr 24, 2026

Description

We’re Salesforce, the Customer Company, inspiring the future of business with AI + Data + CRM. Leading with our core values, we help companies across every industry blaze new trails and connect with customers in a whole new way. And we empower you to be a Trailblazer, too, driving your performance and career growth, charting new paths, and improving the state of the world. If you believe in business as the greatest platform for change, and in companies doing well and doing good, you’ve come to the right place.

Role Description:

This role is on the Tableau Core Quality (Q3) Engineering Team, which helps build high-quality analytics products that are mission-critical for our customers’ and partners’ businesses. Our goal is to ensure the best-quality product is delivered to our customers on time, every time. We are looking for multiple Q3 Engineers (MTS) with strong communication and interpersonal skills and a self-starter attitude to help us build world-class analytics products.

As a Q3 engineer, you will collaborate with cross-functional teams, influence design, and champion our #1 value, Trust, across teams. You will need hands-on programming experience (Java, Python) and the ability to produce a good test plan/spec. You will help the organization develop a customer-centric, quality-aware culture. You will be responsible for proposing and developing requirements for tools and testing frameworks that improve engineer productivity in environment setup and automated testing. You will build, enhance, monitor, and maintain the test automation infrastructure needed to support this effort.

Responsibilities:

  • Deep dive into customer issues and investigations to identify root cause and help with solving the problem.
  • Analyze and build prevention plans to reduce customer issues.
  • Interface with customers to understand real-life usage and build test cases
  • Build tools, frameworks, and infrastructure to improve product quality
  • Implement testing standards and guidelines for specified testing approaches for data pipeline, ELT dataflows and data visualizations
  • Lead initiatives to improve tooling, automation, and integration speed, focusing on developer productivity, release velocity, and product quality
  • Execute testing with complex org setups and varied data shapes
  • Monitor product- and/or feature-level quality health metrics (testability, test health, test coverage, etc.)
  • Be responsive to bugs and prioritized customer investigations.
  • Work closely with cross-functional teams across geographies.

Required Qualifications:

  • BS or MS in Computer Science, or related technical discipline, or equivalent practical experience.
  • Excellent interpersonal and communication skills
  • Self-starter who can work independently, learn quickly, meet deadlines, and demonstrate problem-solving skills
  • Deep knowledge of object-oriented and scripting languages such as Java, Python, and C#; deep understanding of software development best practices
  • 3+ years of experience in testing analytical applications, data visualizations, data pipelines, and machine-learning models
  • Experience with cloud technology including AWS, Azure and/or GCP
  • Familiarity with Database Services in the Cloud
  • Proficient in writing functional and end-to-end test automation (UI/API) using tools such as Selenium
  • Experience working with large datasets including data modeling, logical schema design, ETL/ELT, and developing data pipelines for structured, semi-structured and unstructured data
  • Languages/Frameworks: Java, Python, JavaScript
  • Technologies/Tools: Selenium, Git, TeamCity, Linux, etc.
  • Experience with data query languages (e.g., SQL, Pig) and an understanding of relational and columnar data stores
  • Experience in product development companies preferred
  • Ability to manage assigned projects, meet deadlines, and adapt to changing priorities

Nice to Have Qualifications:

  • Familiarity with machine learning and AI-assisted tooling (Cursor, MCP, Copilot)
  • Development experience would be an advantage