10314 ETL Jobs - Page 37

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

20.0 years

0 Lacs

Bengaluru

On-site

Overview

Job Title: Senior QA Engineer
Location: Bangalore
Position Type: Full-time
Position Level: 3

WHO WE ARE

Xactly is a leader in Sales Performance Management solutions and has been part of the Vista Equity Partners portfolio since 2017. The Xactly Intelligent Revenue Platform helps businesses improve go-to-market outcomes through increased collaboration, greater efficiency, and connected data from all critical functions of the revenue lifecycle on a single platform. Born in the cloud almost 20 years ago, Xactly brings customers extensive experience in solving the most challenging problems faced by customers of all sizes, backed by almost 20 years of proprietary data and award-winning AI. Xactly has been named among the best workplaces in the U.S. by Great Place to Work six times, honored on FORTUNE Magazine's inaugural list of the 100 Best Workplaces for Millennials, and chosen as the "Market Leader in Incentive Compensation" by CRM magazine. We're building a culture of success and are looking for motivated professionals to join us!

THE TEAM

Xactly's QE team is a rapidly growing, well-diversified team with a strong focus on cutting-edge test automation tools and technologies. We are a strong team of 35+ members spread across our engineering centers in San Jose, Denver, and Bangalore (India). All engineers on the QE team are encouraged to operate independently and with the highest levels of accountability. Each QE engineer works in a tight-knit scrum team with back-end developers, front-end developers, and Product Managers, with a laser focus on producing high-quality code and products for our customers. All QE engineers are trained on every aspect of the role, including product training, automation tools and infrastructure, and CI/CD, ensuring their success in scrum teams. Xactly QE team members work with cutting-edge tools and technologies such as Selenium WebDriver, Java, TestNG, Maven, RestAssured, Jenkins, Docker, Kubernetes, Harness, Snowflake, Terraform, and JMeter, to name a few.

THE OPPORTUNITY

As a Senior QA Engineer at Xactly Corporation, you will maintain and continuously improve the QE function and facilitate the implementation of QE best practices within the organization. You will establish partnerships with internal stakeholders to understand customer requirements and ensure quality of delivery; own, drive, measure, and optimize the overall quality of the development and delivery process; and drive quality automation, taking the customer's perspective on end-to-end quality. At Xactly, we believe everyone has a unique story to tell, and these small differences between us have a big impact. When bright, diverse minds come together, we're challenged to think in different ways, generate creative ideas, be more innovative, and take on new perspectives. Our customers come from different cultures and walks of life all around the world, and we believe our teams should reflect that to build strong and lasting relationships.

THE SKILL SETS

- 5-8 years of experience with strong automation testing skills.
- Strong testing skills, with the ability to develop test strategies, design test plans, and write test cases effectively and independently.
- Strong experience in GUI automation (such as Selenium) and API automation (such as JUnit) using off-the-shelf tools.
- Experience in testing enterprise J2EE business applications.
- Strong SQL query knowledge in PostgreSQL or Oracle databases.
- Experience with the Mabl testing tool is a plus.
- Strong experience as a QA engineer in a Scrum methodology requiring automated tests as the definition of done.
- Programming experience in a language such as Java.
- Experience in product-based companies.

NICE-TO-HAVE SKILLS

- Working on a team in a SAFe Portfolio and ART.
- Exposure to ETL/analytics modules.
- Exposure to build and deployment tools such as Jenkins, Harness, and Maven.

WITHIN ONE MONTH, YOU'LL
- Attend New Hire Training.
- Learn the Dev and QE processes.
- Participate in the scrum development process.
- Get to know your team.

WITHIN THREE MONTHS, YOU'LL
- Learn Xactly's SaaS technology stack.
- Gain complete domain and Xactly product knowledge.
- Take ownership of a module/project.
- Perform code reviews.

WITHIN SIX MONTHS, YOU'LL
- Ensure best QE practices are being used.
- Work on multiple functionalities and take ownership of the respective module's automation.
- Perform RCAs on production escapes and ensure corrective actions are implemented.

WITHIN TWELVE MONTHS, YOU'LL
- Help grow other engineers technically through pairing and other learning opportunities.
- Train new joiners and peers in automation.
- Continuously work on QE process improvements to maximize team effectiveness and efficiency.

BENEFITS & PERKS
- Paid Time Off (PTO)
- Comprehensive health and accidental insurance coverage
- Tuition reimbursement
- XactlyFit gym/fitness program reimbursement
- Free snacks onsite (if you work in office)
- Generous employee referral program
- Free parking and subsidized bus pass (a go-green initiative!)
- Wellness program

OUR VISION: Unleashing human potential to maximize company performance. We address a critical business need: to incentivize employees and align their behaviors with company goals.

OUR CORE VALUES: Customer Focus | Accountability | Respect | Excellence (CARE) are the keys to our success, and each day we're committed to upholding them by delivering the best we can to our customers.

Xactly is proud to be an Equal Opportunity Employer. Xactly provides equal employment opportunities to all employees and applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, pregnancy, sexual orientation, or any other characteristic protected by law. This means we believe in celebrating diversity and creating an inclusive workplace environment where everyone feels valued, heard, and has a sense of belonging. By doing this, everyone in the Xactly family has the power to make a difference and unleash their full potential. We do not accept resumes from agencies, headhunters, or other suppliers who have not signed a formal agreement with us.
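The API-automation skill this posting asks for (RestAssured-style request/response checks, in Java on Xactly's stack) follows a simple request-then-assert pattern. A minimal sketch of that pattern in Python, using a stubbed response instead of a live service; the endpoint and payload fields are invented for illustration:

```python
# Sketch of the request -> assert pattern used in API test automation.
# fake_get stands in for an HTTP client call; endpoint and fields are hypothetical.
import json

def fake_get(url):
    """Stand-in for an HTTP GET; a real suite would call a live service."""
    body = {"id": 42, "status": "ACTIVE"}
    return {"status_code": 200, "text": json.dumps(body)}

def test_get_account():
    resp = fake_get("/api/accounts/42")
    assert resp["status_code"] == 200      # transport-level check
    payload = json.loads(resp["text"])
    assert payload["status"] == "ACTIVE"   # body-level check
    return payload

result = test_get_account()
print(result["id"])  # 42
```

The same two-layer assertion (HTTP status first, then deserialized body) is what RestAssured's `then().statusCode(...).body(...)` chain expresses in Java.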

Posted 3 days ago

7.0 - 9.0 years

6 - 7 Lacs

Bengaluru

On-site

7 - 9 Years | 7 Openings | Bangalore

Role description
- Work closely with business stakeholders to understand their needs, objectives, and challenges.
- Elicit, document, and analyze business requirements, processes, and workflows.
- Translate business requirements into clear and concise functional specifications for technical teams.
- Collaborate with technology teams to design solutions that meet business needs.
- Propose innovative and practical solutions to address business challenges.
- Ensure that proposed solutions align with the organization's strategic goals and technological capabilities.
- Identify areas for process optimization and efficiency enhancement.
- Recommend process improvements and assist in their implementation.
- Must have very good knowledge of the healthcare domain and SQL.
- Good to have: AWS and Snowflake technologies.
- Hands-on with complex SQL queries (Snowflake).
- Knowledge of database management systems, both relational and non-relational.
- Familiarity with data integration and ETL tools.

Skills: Business Analysis, Business System Analysis, SQL, Snowflake, SDLC, US Healthcare domain, strong communication & documentation

About UST
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world's best companies to make a real impact through transformation. Powered by technology, inspired by people, and led by purpose, UST partners with clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into its clients' organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact, touching billions of lives in the process.
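The "complex SQL queries (Snowflake)" requirement typically means CTEs, aggregation, and filtered rollups. A minimal sketch of that query shape, using Python's built-in sqlite3 purely as a runnable stand-in for Snowflake; the claims table and threshold are invented for illustration:

```python
import sqlite3

# Hypothetical claims table standing in for a Snowflake healthcare dataset;
# sqlite3 is used only so the query runs locally without a warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE claims (claim_id INTEGER, member_id TEXT, amount REAL, status TEXT);
INSERT INTO claims VALUES
  (1, 'M1', 120.0, 'PAID'),
  (2, 'M1', 300.0, 'DENIED'),
  (3, 'M2', 80.0,  'PAID'),
  (4, 'M2', 50.0,  'PAID');
""")

# A CTE aggregates paid claims per member, then the outer query filters
# members above a spend threshold -- the CTE + aggregate + filter shape
# that "complex SQL" usually refers to.
query = """
WITH paid AS (
  SELECT member_id, SUM(amount) AS total_paid
  FROM claims
  WHERE status = 'PAID'
  GROUP BY member_id
)
SELECT member_id, total_paid FROM paid
WHERE total_paid > 100
ORDER BY member_id;
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('M1', 120.0), ('M2', 130.0)]
```

The same SQL text would run on Snowflake with only the connection layer swapped out.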

Posted 3 days ago

7.0 years

4 - 6 Lacs

Bengaluru

On-site

About Us
Automation Anywhere is a leader in AI-powered process automation that puts AI to work across organizations. The company's Automation Success Platform is powered with specialized AI and generative AI and offers process discovery, RPA, end-to-end process orchestration, document processing, and analytics, with a security- and governance-first approach. Automation Anywhere empowers organizations worldwide to unleash productivity gains, drive innovation, improve customer service, and accelerate business growth. The company is guided by its vision to fuel the future of work by unleashing human potential through AI-powered automation. Learn more at www.automationanywhere.com

Our opportunity
As a Data Engineer, you'll partner with business, finance, and IT to build BI products and services for reporting, data visualization, and ad hoc analytics. You'll help bring alignment to Automation Anywhere's data model and formalize its data dictionary. You'll also help establish the governance needed to manage and maintain the Company's data model, dictionary, and lineage. You'll partner with the IT teams that own the core data infrastructure, security, and privacy services the BI platform will rely on.

You will make an impact by being responsible for:
- Delivering business capabilities through technology enablement, including data warehousing, data analytics, and visualizations.
- Leading data workstreams with cross-functional business stakeholders such as Sales, Marketing, IT, Product, and Finance.
- Partnering with cross-functional teams to drive BI and analytics initiatives.
- Communicating project scope, timing, prioritization, budgets, resource needs, and progress on an ongoing basis.
- Driving development and delivery of reporting features, analytics, unit economics, and KPIs through sprint planning and production releases.
- Helping to design and build the schemas that will underpin the BI data warehouse.
- Developing and maintaining a governance structure over the data model and data dictionary.
- Ensuring lineage of data used in the production of unit economics, KPIs, and other financial and non-financial metrics.
- Driving adoption of the BI platform and its features by organizing demo days, user training sessions, launch updates, etc.

You will be a great fit if you have:
- A BS in computer/data science or an equivalent degree.
- A minimum of 7 years of engineering experience delivering solutions/products related to data infrastructure at scale.
- Strong SQL skills and experience working with relational databases and distributed systems.
- Experience with at least one programming language for data analysis (e.g., Python or R).
- Experience with ETL frameworks, data modeling, and data architecture.
- Experience with data visualization platforms like Tableau.
- A hands-on, end-to-end understanding of analytics, from descriptive to predictive analytics including machine learning, across data ingestion, preparation, development, and end-user adoption.

You excel in these key competencies:
- Solid interpersonal skills with a proven ability to develop and maintain effective business partner relationships.
- A demonstrated track record of collaborating with high-performing cross-functional teams.
- Demonstrated ability to think strategically, framing business issues as data warehousing solutions.
- Demonstrated ability to communicate complex analyses clearly and concisely to the leadership team via presentations.
- Demonstrated ability to manage multiple tasks and projects, prioritize, and adapt to a changing environment.
- Willingness to roll up your sleeves and get work done: a productive contributor to detailed content, with a strong sense of accountability and ownership.

Why Automation Anywhere?
At our company each person brings their unique talents to work as a team and make a difference. As the leader in Robotic Process Automation (RPA), we provide a very compelling product where our teams are breaking new ground every day and given an environment to grow their skills and have fun along the way. Our technology is the game changer, and our people give us the edge to better our world and go be great! All unsolicited resumes submitted to any @automationanywhere.com email address, whether submitted by an individual or by an agency, will not be eligible for an agency fee.
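The "ETL frameworks" experience this role asks for reduces to an extract-transform-load loop. A minimal sketch in plain Python; the record fields, amounts, and the in-memory "warehouse" target are all invented for illustration:

```python
# Minimal ETL sketch: extract raw records, transform (cast + aggregate),
# load into a target. Field names and figures are hypothetical.

def extract():
    # In a real pipeline this would read from an API, file, or database.
    return [
        {"region": "APAC", "amount": "100.50"},
        {"region": "EMEA", "amount": "200.25"},
        {"region": "APAC", "amount": "49.50"},
    ]

def transform(rows):
    # Cast string amounts to floats and aggregate revenue per region.
    totals = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + float(row["amount"])
    return totals

def load(totals, target):
    # Stand-in for writing to a warehouse table.
    target.update(totals)

warehouse = {}
load(transform(extract()), warehouse)
print(warehouse)  # {'APAC': 150.0, 'EMEA': 200.25}
```

Real frameworks add scheduling, retries, and incremental loads around this same three-stage skeleton.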

Posted 3 days ago

5.0 - 7.0 years

3 - 10 Lacs

Bengaluru

On-site

5 - 7 Years | 3 Openings | Bangalore

Role description
- Work closely with business stakeholders to understand their needs, objectives, and challenges.
- Elicit, document, and analyze business requirements, processes, and workflows.
- Translate business requirements into clear and concise functional specifications for technical teams.
- Collaborate with technology teams to design solutions that meet business needs.
- Propose innovative and practical solutions to address business challenges.
- Ensure that proposed solutions align with the organization's strategic goals and technological capabilities.
- Identify areas for process optimization and efficiency enhancement.
- Recommend process improvements and assist in their implementation.
- Must have very good knowledge of the healthcare domain and SQL.
- Good to have: AWS and Snowflake technologies.
- Hands-on with complex SQL queries (Snowflake).
- Knowledge of database management systems, both relational and non-relational.
- Familiarity with data integration and ETL tools.

Skills: Business Analysis, Business System Analysis, SQL, Snowflake, SDLC, US Healthcare domain, strong communication & documentation

About UST
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world's best companies to make a real impact through transformation. Powered by technology, inspired by people, and led by purpose, UST partners with clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into its clients' organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact, touching billions of lives in the process.

Posted 3 days ago

0 years

7 - 9 Lacs

Bengaluru

On-site

Oracle is seeking energetic and highly skilled Integration Services Contact Center Software Engineers to join our cloud-based contact center (CCaaS) project. In this critical role, you'll partner closely with Technical Architects, Business Architects, Business Analysts, and Development and Testing Leads to build cutting-edge customer engagement solutions across voice and digital channels. This position is ideal for individuals passionate about solving complex problems and contributing to a mission-critical, Tier-0 service that orchestrates communication between customers and agents. You'll be instrumental in developing capabilities for support services, sales inquiries, and marketing campaigns for over 15,000 agents across 80 Oracle Business Units.

Key Responsibilities

- Web Services: Create reliable, secure, and high-performance REST APIs by applying industry best practices. This includes implementing strong authentication and authorization protocols, utilizing caching and pagination strategies, and making the APIs available for integration and consumption by other teams.
- Cloud-Native Service Development: Design and develop cloud-native microservices using Python, Node.js, and Oracle's Helidon framework (Java). This includes implementing backend database queries, leveraging caching solutions like Coherence, and deploying containerized applications using Kubernetes. Utilize OCI services such as API Gateway, Vault, and Object Storage to ensure seamless and secure service deployment. Experience building high-level and low-level design documents and diagrams using Visio or Confluence is important.
- Backend Integration: Design and develop backend logic to integrate with CRM and CCaaS APIs. This includes working with REST APIs, webhooks, and WebSocket frameworks to ensure seamless data exchange and functionality between systems.
- Database: Proficiency in writing and optimizing complex SQL queries using Oracle Database. Experience with data extraction, transformation, and loading (ETL) processes. Skilled in using joins, subqueries, views, indexes, and PL/SQL procedures to manage large datasets. Familiarity with Oracle tools such as Oracle SQL Developer. Strong understanding of query tuning and performance optimization.
- Frontend Development: Design and develop frontend applications using HTML, Node.js, jQuery, Ajax, and Oracle Visual Builder Cloud Service.
- Process & Standards Management: Collaborate with other developers to maintain consistency and continuously improve development processes and standards within the SDLC, including working with Git repositories.
- Troubleshooting & Support: Actively troubleshoot complex issues by analyzing logs to determine root causes, providing support for developed features.

Preferred Qualifications & Experience

We're looking for individuals with good problem-solving and communication skills and extensive software development experience, with strong proficiency in Java, J2EE, Python, Node.js, Kubernetes, and web services (SOAP, REST). Ideal candidates will have a strong understanding of HLD and LLD practices and proven expertise in developing complex applications using backend and front-end technologies. Knowledge of Kubernetes and DevOps CI/CD pipelines is important; knowledge of the contact center domain (CCaaS), covering inbound and outbound features, is an added advantage.

Development Tools & Methodologies: Candidates should possess hands-on experience with Java IDEs such as JDeveloper, Eclipse, NetBeans, or IntelliJ IDEA; with web application security, authentication methodologies, and SSO technologies; and with contact center APIs such as Zoom or Genesys. They must have demonstrable experience with test-driven development, automated test frameworks, mocking/stubbing, and JUnit, coupled with a strong understanding of implementing Java best practices for scalability and performance in high-volume environments. Experience with Agile development methodologies is also essential.

Platform & Cloud Proficiency: Strong knowledge of either the Zoom or Genesys Contact Center Platform (including its Framework, Voice, Digital, and Outbound components) is a significant advantage. Familiarity with cloud platforms (e.g., OCI, AWS, Azure) and cloud-native development concepts is highly preferred.

As part of a software project implementation team, you will provide technical expertise in identifying, evaluating, and developing systems and procedures that are cost-effective and meet user needs within the contact center domain. This includes configuring system settings and options; planning and executing unit, integration, and acceptance testing; and creating specifications for systems to meet business requirements. You will design the intricate details of automated systems and may provide consultation to users of automated systems. This role may also involve leading cross-functional linked teams to address complex business or systems issues. You will provide leadership and expertise in evaluating and developing complex business problems, frequently operating at the leading edge of technology. Recommending and justifying major changes to existing automated systems will be a key aspect of your contribution.
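Among the REST best practices the responsibilities call out, pagination is easy to show concretely. A minimal offset-based sketch in Python; the dataset size and the `limit`/`offset` parameter names are illustrative, not Oracle's actual API:

```python
# Sketch of offset-based pagination for a REST collection endpoint.
# The dataset and parameter names (limit, offset) are hypothetical.

DATASET = [{"id": i} for i in range(1, 8)]  # 7 invented records

def get_page(limit, offset):
    """Return one page plus the metadata a client needs to fetch the next."""
    items = DATASET[offset:offset + limit]
    next_offset = offset + limit if offset + limit < len(DATASET) else None
    return {"items": items, "next_offset": next_offset}

# A client walks pages until next_offset comes back as None.
collected, offset = [], 0
while offset is not None:
    page = get_page(limit=3, offset=offset)
    collected.extend(page["items"])
    offset = page["next_offset"]

print(len(collected))  # 7
```

Returning the next-page token in the response body (rather than making clients compute it) is what lets the server later switch to cursor-based pagination without breaking callers.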

Posted 3 days ago

0.0 - 1.0 years

0 Lacs

Mumbai, Maharashtra

On-site

We are looking for a skilled and detail-oriented Power BI Developer to join our team. The ideal candidate will have hands-on experience in designing and developing insightful, high-performance dashboards, reports, and KPIs using Microsoft Power BI. You will play a key role in transforming data into valuable business insights and delivering top-tier analytics solutions to clients.

Key Responsibilities:
- Design and develop interactive reports, dashboards, and KPI scorecards using Power BI.
- Strong UI/UX design skills tailored for executive dashboards.
- Experience in Power BI layout, theming, and custom visuals.
- Experience in defining and organizing measures and dimensions to ensure clarity and consistency.
- Ability to create visually appealing, responsive, and intuitive dashboards.
- Experience migrating dashboards from QlikView to Power BI.
- Build tabular and multidimensional models aligned with data warehouse standards.
- Develop Analysis Services (SSAS) reporting models.
- Connect to various data sources, importing and transforming data using Power Query/M, DAX, and Power BI tools.
- Implement row-level security and understand application security models within Power BI.
- Perform advanced DAX calculations for data manipulation and insight generation.
- Optimize Power BI performance and troubleshoot dashboard/report issues.
- Translate business needs into data-driven reports and visual storytelling.
- Ensure data governance, quality, and security best practices.
- Document design methodology, technical specifications, and project deliverables.
- Engage with business stakeholders to gather requirements and deliver analytics solutions.
- Analyze large datasets and present actionable insights to client teams.

Mandatory Skills & Experience:
- 3+ years of experience in Power BI report development and BI roles.
- Knowledge of other tools such as QlikView, and of migration from them to Power BI.
- Strong knowledge of SQL, query performance tuning, and data modelling.
- Proven experience with the Microsoft BI stack: Power BI, SSAS, SSRS, SSIS.
- Solid understanding of relational and multidimensional database design.
- Familiarity with the full Power BI ecosystem: Power BI Premium, Power BI Service, Power BI Report Server, Power Query, etc.
- Knowledge of presentation tools and the ability to create executive-level presentations.
- Strong analytical thinking, problem-solving, and attention to detail.
- Excellent communication skills for stakeholder interaction and requirements gathering.
- Comfortable creating reports from wireframes and functional requirements.
- Strong business acumen and the ability to derive insights that incite action.

Nice to Have (Preferred):
- Understanding of ETL processes, data pipeline architecture, and data warehousing on Azure.
- Experience in distributed systems for data extraction, ingestion, and processing at scale.

Soft Skills:
- Resilient under pressure and deadlines.
- Proactive, self-driven attitude with strong ownership.
- Excellent team collaboration and communication.
- Client-focused mindset with a desire to grow in a dynamic environment.

Job Type: Full-time
Pay: ₹600,000.00 - ₹1,000,000.00 per year
Schedule: Day shift
Ability to commute/relocate: Mumbai, Maharashtra: Reliably commute or plan to relocate before starting work (Required)
Application Question(s): Post selection, can you join immediately or within 30 days?
Experience: Power BI: 2 years (Required); QlikView: 1 year (Preferred)
Work Location: In person
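The "advanced DAX calculations" bullet usually means time-intelligence measures such as a year-to-date total (DAX's TOTALYTD). The underlying logic can be shown in plain Python; the sales figures and dates below are invented for illustration:

```python
# Emulates the logic of a DAX year-to-date measure (like TOTALYTD) in
# plain Python; the sales rows are hypothetical.
from datetime import date

sales = [
    (date(2024, 1, 15), 100),
    (date(2024, 2, 10), 50),
    (date(2024, 3, 5), 25),
    (date(2025, 1, 20), 40),
]

def ytd_total(rows, as_of):
    """Sum amounts from Jan 1 of as_of's year up to and including as_of."""
    return sum(amount for d, amount in rows
               if d.year == as_of.year and d <= as_of)

print(ytd_total(sales, date(2024, 2, 28)))  # 150
print(ytd_total(sales, date(2025, 6, 30)))  # 40
```

In DAX the date filtering is done for you by the calendar table and filter context; this sketch makes that implicit filter explicit.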

Posted 3 days ago

5.0 years

0 Lacs

India

Remote

QlikSense Developer
Location: Remote

Required Skills and Qualifications:
• Experience: 5+ years of hands-on experience with QlikSense development, including dashboard creation, scripting, and data modeling.
  o Proficiency in QlikSense scripting and data modeling.
  o Strong SQL skills for data analysis and manipulation.
  o Experience with QlikSense NPrinting.
  o Experience with Qlik GeoAnalytics and Qlik APIs is a plus.
  o Familiarity with other BI tools (e.g., Tableau, Power BI) is a bonus.
  o Strong understanding of data warehousing and ETL concepts.
• Soft Skills:
  o Excellent problem-solving and analytical skills.
  o Strong communication and stakeholder management abilities.
  o Ability to work independently and as part of a team.

Posted 3 days ago

8.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Morgan Stanley
Senior Platform Engineer - Vice President - Software Production Management & Reliability Engineering

Profile Description
We're seeking someone to join our team as a (Vice President) Systems Engineer, responsible for the stability, integrity, and efficient operation of the in-house and third-party systems that support core organizational functions. This is achieved by monitoring, maintaining, supporting, and optimizing all networked software and associated operating systems. The Systems Engineer will apply proven communication, analytical, and problem-solving skills to help identify, communicate, and resolve issues in order to maximize the benefit of IT systems investments. This individual will also mentor and provide guidance to the Systems Engineer staff.

Investment Management
In the Investment Management division, we deliver active investment strategies across public and private markets and custom solutions to institutional and individual investors. This is a Vice President position that oversees the production environment, ensures the operational reliability of deployed software, and implements strategies to optimize performance and minimize downtime.

Morgan Stanley is an industry leader in financial services, known for mobilizing capital to help governments, corporations, institutions, and individuals around the world achieve their financial goals. At Morgan Stanley India, we support the Firm's global businesses, with a critical presence across Institutional Securities, Wealth Management, and Investment Management, as well as in the Firm's infrastructure functions of Technology, Operations, Finance, Risk Management, Legal, and Corporate & Enterprise Services. Morgan Stanley has been rooted in India since 1993, with campuses in both Mumbai and Bengaluru. We empower our multi-faceted and talented teams to advance their careers and make a global impact on the business. For those who show passion and grit in their work, there's ample opportunity to move across the businesses. Interested in joining a team that's eager to create, innovate, and make an impact on the world? Read on…

What You'll Do In The Role
- Designing: Design and implement the overall IM Technology platform architecture, ensuring seamless operation and alignment with the organization's goals, scalability requirements, and industry best practices.
- Technology Evaluation: Drive innovation by evaluating and selecting appropriate technologies, frameworks, and tools that will maximize the value produced by the IM Technology platform.
- Guidance and Leadership: Provide technical guidance and leadership to IM Technology developers throughout the development lifecycle. Liaise between IM Technology groups and End User Technology to triage environmental issues and resolve them before they reach production.
- Scalability and Performance: Ensure that the platform architecture can scale efficiently and meet performance requirements.
- Security: Implement security best practices for infrastructure, such as network security, access control, and data encryption, and ensure that the platform architecture is resilient to security threats. Work closely with the Security Architecture team to keep our infrastructure aligned with frequent changes to firm-level security policies.
- Integration: Oversee the integration of various components and systems within the platform. Act as a conduit for secure data transfer between critical platform applications and third-party data providers or receivers.
- Collaboration: Collaborate with product owners, business stakeholders, and ITSOs to ensure the system's architecture supports the organization's goals and objectives.
- Testing: Write and execute tests to ensure the reliability, scalability, and performance of the platform.
- Deployment: Manage the deployment process and ensure smooth deployments of platform updates and releases.
- Monitoring and Maintenance: Monitor the platform for performance issues, bugs, and security vulnerabilities, and address them promptly. This includes performing routine preventative maintenance such as system patching, updates, and upgrades.
- Automation: Implement automation tools and processes to streamline development, deployment, and maintenance tasks.
- Documentation: Create and maintain technical documentation for the platform components and processes.
- Infrastructure as Code: Manage infrastructure using code-based tools like Terraform or CloudFormation to ensure simplicity of the platform, minimization of errors, and adherence to Change Management principles.
- Containerization and Orchestration: Implement containerization using Docker and container orchestration using Kubernetes or similar tools.
- Monitoring and Logging: Set up monitoring and logging solutions to track the performance and health of the platform. This involves ensuring that any logs generated throughout the platform are tracked in firm-approved systems and secured according to their level of confidentiality.

What You'll Bring To The Role
- At least 8 years' relevant experience would generally be expected to find the skills required for this role.
- Good working experience in at least some of the technologies below:
  - Middleware (e.g., Tomcat, WebSphere, WebLogic)
  - App containerization (e.g., Kubernetes, Red Hat OpenShift)
  - Automation (e.g., UiPath RPA, Microsoft Power, Airflow)
  - Message queues (e.g., Kafka, MQ)
  - ETL (e.g., Glide)
  - Analytics (e.g., Dataiku)
  - Data management (e.g., Snowflake, Databricks)
- Sound knowledge of IT application architecture and design methodologies across multiple platforms.
- Good understanding of application capacity management.
- Good experience with application resilience planning.
- Good working knowledge of SQL and scripting.
- Sound knowledge of multiple operating systems.
- Flexibility for off-hours and weekend availability.
- Excellent understanding of the organization's goals and objectives.
- Good project management skills.
- Excellent written, oral, and interpersonal communication skills.
- Proven analytical and creative problem-solving abilities.
- Ability to prioritize and execute tasks in a high-pressure environment.
- Ability to work in a team-oriented, collaborative environment.

What You Can Expect From Morgan Stanley
We are committed to maintaining the first-class service and high standard of excellence that have defined Morgan Stanley for over 89 years. Our values - putting clients first, doing the right thing, leading with exceptional ideas, committing to diversity and inclusion, and giving back - aren't just beliefs; they guide the decisions we make every day to do what's best for our clients, communities, and more than 80,000 employees in 1,200 offices across 42 countries. At Morgan Stanley, you'll find an opportunity to work alongside the best and the brightest, in an environment where you are supported and empowered. Our teams are relentless collaborators and creative thinkers, fueled by their diverse backgrounds and experiences. We are proud to support our employees and their families at every point along their work-life journey, offering some of the most attractive and comprehensive employee benefits and perks in the industry. There's also ample opportunity to move about the business for those who show passion and grit in their work.

Morgan Stanley is an equal opportunities employer. We work to provide a supportive and inclusive environment where all individuals can maximize their full potential. Our skilled and creative workforce comprises individuals drawn from a broad cross section of the global communities in which we operate and who reflect a variety of backgrounds, talents, perspectives, and experiences. Our strong commitment to a culture of inclusion is evident through our constant focus on recruiting, developing, and advancing individuals based on their skills and talents.

Posted 3 days ago


10.0 years

0 Lacs

Pune, Maharashtra, India

On-site


About the role: Want to be on a team full of results-driven individuals who are constantly seeking to innovate? Want to make an impact? At SailPoint, our Data Platform team does just that. SailPoint is seeking a Senior Staff Data/Software Engineer to help build robust data ingestion and processing systems to power our data platform. This role is a critical bridge between teams. It requires excellent organization and communication as the coordinator of work across multiple engineers and projects. We are looking for well-rounded engineers who are passionate about building and delivering reliable, scalable data pipelines. This is a unique opportunity to build something from scratch while having the backing of an organization that has the muscle to take it to market quickly, with a very satisfied customer base. Responsibilities : Spearhead the design and implementation of ELT processes, especially focused on extracting data from and loading data into various endpoints, including RDBMS, NoSQL databases and data warehouses. Develop and maintain scalable data pipelines for both stream and batch processing, leveraging JVM-based languages and frameworks. Collaborate with cross-functional teams to understand diverse data sources and environment contexts, ensuring seamless integration into our data ecosystem. Utilize the AWS service stack wherever possible to implement lean design solutions for data storage, data integration and data streaming problems. Develop and maintain workflow orchestration using tools like Apache Airflow. Stay abreast of emerging technologies in the data engineering space, proactively incorporating them into our ETL processes. Organize work from multiple Data Platform teams and customers with other Data Engineers. Communicate status, progress and blockers of active projects to Data Platform leaders. Thrive in an environment with ambiguity, demonstrating adaptability and problem-solving skills. Qualifications : BS in computer science or a related field.
10+ years of experience in data engineering or a related field. Demonstrated system-design experience orchestrating ELT processes targeting data. Excellent communication skills. Demonstrated ability to internalize business needs and drive execution from a small team. Excellent organization of work tasks and status of new and in-flight tasks, including impact analysis of new work. Strong understanding of Python. Good understanding of Java. Strong understanding of SQL and data modeling. Familiarity with Airflow. Hands-on experience with at least one streaming or batch processing framework, such as Flink or Spark. Hands-on experience with containerization platforms such as Docker and container orchestration tools like Kubernetes. Proficiency in the AWS service stack. Experience with DBT, Kafka, Jenkins and Snowflake. Experience leveraging tools such as Kustomize, Helm and Terraform for implementing infrastructure as code. Strong interest in staying ahead of new technologies in the data engineering space. Comfortable working in ambiguous team situations, showcasing adaptability and drive in solving novel problems in the data engineering space. Preferred: Experience with AWS. Experience with Continuous Delivery. Experience instrumenting code for gathering production performance metrics. Experience in working with a Data Catalog tool (e.g., Atlan). What success looks like in the role Within the first 30 days you will: Onboard into your new role, get familiar with our product offering and technology, proactively meet peers and stakeholders, and set up your test and development environment. Seek to deeply understand business problems or common engineering challenges. Learn the skills and abilities of your teammates and align expertise with available work. By 90 days: Proactively collaborate on, discuss, debate and refine ideas, problem statements, and software designs with different (sometimes many) stakeholders, architects and members of your team.
Increase team velocity and contribute to maturing and delivering the Data Platform vision. By 6 months: Collaborate with Product Management and the Engineering Lead to estimate and deliver small- to medium-complexity features more independently. Occasionally serve as a debugging and implementation expert during escalations of system issues that have evaded the ability of less experienced engineers to solve in a timely manner. Share support of critical team systems by participating in calls with customers, learning the characteristics of currently running systems, and participating in improvements. Engage with team members, providing them with challenging work and building cross-skill expertise. Plan project support and execution with peers and Data Platform leaders. SailPoint is an equal opportunity employer and we welcome all qualified candidates to apply to join our team. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, protected veteran status, or any other category protected by applicable law. Alternative methods of applying for employment are available to individuals unable to submit an application through this site because of a disability. Contact hr@sailpoint.com or mail to 11120 Four Points Dr, Suite 100, Austin, TX 78726, to discuss reasonable accommodations.
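The ELT responsibilities above (extract from sources, land raw data, then transform inside the warehouse) can be sketched in plain Python. This is a minimal, framework-agnostic illustration of the extract → load → transform ordering an orchestrator such as Airflow would schedule; the `orders` source, field names, and in-memory "warehouse" dict are hypothetical stand-ins, not SailPoint's actual stack.

```python
# Minimal ELT sketch: land raw rows first, transform inside the "warehouse".
# All names (orders, staging_orders, qty, unit_price) are invented examples.

def extract(source_rows):
    """Pull raw records from a source system (a list stands in for an API/DB)."""
    return [dict(row) for row in source_rows]

def load(raw_rows, warehouse):
    """Land raw records untouched in a staging area (ELT loads before transforming)."""
    warehouse.setdefault("staging_orders", []).extend(raw_rows)
    return len(raw_rows)

def transform(warehouse):
    """Transform inside the warehouse: drop bad rows, derive a total column."""
    clean = [
        {**r, "total": r["qty"] * r["unit_price"]}
        for r in warehouse["staging_orders"]
        if r.get("qty", 0) > 0
    ]
    warehouse["orders"] = clean
    return clean

source = [{"id": 1, "qty": 2, "unit_price": 5.0}, {"id": 2, "qty": 0, "unit_price": 9.0}]
wh = {}
load(extract(source), wh)
rows = transform(wh)
```

An orchestrator would run each function as a separate, retryable task in dependency order; the functions themselves stay pure so they can be tested without the scheduler.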

Posted 3 days ago


35.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


About Us One team. Global challenges. Infinite opportunities. At Viasat, we’re on a mission to deliver connections with the capacity to change the world. For more than 35 years, Viasat has helped shape how consumers, businesses, governments and militaries around the globe communicate. We’re looking for people who think big, act fearlessly, and create an inclusive environment that drives positive impact to join our team. What You'll Do You are a capable, self-motivated data engineer, proficient in software development methods, including Agile/Scrum. You will be a member of the data engineering team, working on tasks ranging from the design, development, and operations of data warehouses to data platform functions. We enjoy working closely with each other, utilizing an agile development methodology. Priorities can change quickly, but our team members stay ahead to delight every one of our customers, whether they are internal or external to Viasat. The day-to-day Strong programming experience using Python. Proven track record with 5+ years of experience as a data engineer or experience working on data engineering projects/platforms. Working experience with data pipelines & methodologies. Experience with SQL and a wide variety of databases, like PostgreSQL. Good knowledge of and experience in distributed computing frameworks like Spark. Good experience with source code management systems like Git. Capable of tuning databases and SQL queries to meet performance objectives. Bachelor’s degree in computer science, computer engineering, or electrical engineering, or an equivalent technical background and experience. Embracing the DevOps philosophy of product development, in addition to your design and development activities, you are also required to provide operational support for the post-production deployment.
What You'll Need Experience Requirement: 5+ years Education Requirement: Bachelor’s degree Travel Requirement: Up to 10% What Will Help You On The Job Experience with cloud providers like AWS, containerization, and container orchestration frameworks like Kubernetes is preferred. Working experience with data warehouses & ETL tools. Capable of debugging sophisticated issues across various ETL platforms and databases. Experience with DevOps and tools such as Jenkins and Ansible is an advantage. Experience with small- to mid-sized software development projects. Experience with Agile Scrum is a plus. Understanding of routing, switching, and basic network communication protocols. EEO Statement Viasat is proud to be an equal opportunity employer, seeking to create a welcoming and diverse environment. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, ancestry, physical or mental disability, medical condition, marital status, genetics, age, or veteran status or any other applicable legally protected status or characteristic. If you would like to request an accommodation on the basis of disability for completing this on-line application, please click here.

Posted 3 days ago


5.0 years

0 Lacs

Bengaluru, Karnataka, India

Remote


Syniverse is the world’s most connected company. Whether we’re developing the technology that enables intelligent cars to safely react to traffic changes or freeing travelers to explore by keeping their devices online wherever they go, we believe in leading the world forward. Which is why we work with some of the world’s most recognized brands. Eight of the top 10 banks. Four of the top 5 global technology companies. Over 900 communications providers. And that's how we’re able to provide our incredible talent with an innovative culture and great benefits. Who We're Looking For The Sr Data Engineer is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems or building new solutions from the ground up. This role will work with developers, architects, product managers and data analysts on data initiatives and ensure optimal data delivery with good performance and uptime metrics. Your behaviors align strongly with our values. Some Of What You'll Do Scope of the Role: Direct Reports: This is an individual contributor role with no direct reports. Key Responsibilities Create, enhance, and maintain optimal data pipeline architecture and implementations. Analyze data sets to meet functional / non-functional business requirements. Identify, design, and implement data process improvements: automating processes, optimizing data delivery, etc. Build infrastructure and tools to increase data ETL velocity. Work with data and analytics experts to implement and enhance analytic product features. Provide life-cycle support to the Operations team for existing products, services, and functionality assigned to the Data Engineering team. Experience, Education, And Certifications Bachelor’s degree in Computer Science, Statistics, Informatics or a related field, or equivalent work experience. 5+ years of Software Development experience, including 3+ years of experience in data engineering.
Experience in building and optimizing big data pipelines, architectures, and data sets: Experience with big data tools: Hadoop, Spark, Kafka, etc. Experience with relational SQL databases, such as PostgreSQL, MySQL, etc. Experience with stream-processing systems: Flink, KSQL, Spark Streaming, etc. Experience with programming languages, such as Java, Scala, Python, etc. Experience with cloud data engineering and development, such as AWS, etc. Additional Requirements Familiar with Agile software design processes and methodologies. Good analytic skills related to working with structured and unstructured datasets. Knowledge of message queuing, stream processing and scalable big data stores. Ownership/accountability for tasks/projects with on-time and quality deliveries. Good verbal and written communication skills. Teamwork with independent design and development habits. Work with a sense of urgency and a positive attitude. Why You Should Join Us Join us as we write a new chapter, guided by world-class leadership. Come be a part of an exciting and growing organization where we offer competitive total compensation, flexible/remote work, and a leadership team committed to fostering an inclusive, collaborative, and transparent organizational culture. At Syniverse connectedness is at the core of our business. We believe diversity, equity, and inclusion among our employees is crucial to our success as a global company as we seek to recruit, develop, and retain the most talented people who want to help us connect the world. Know someone at Syniverse? Be sure to have them submit you as a referral prior to applying for this position.
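The stream-processing systems named above (Flink, KSQL, Spark Streaming) share one core idea: aggregating an unbounded event stream over fixed time windows. Below is a conceptual, pure-Python sketch of a tumbling-window count; real engines add event-time watermarks, state backends, and fault tolerance. The event tuples and the 10-second window size are illustrative assumptions.

```python
# Conceptual tumbling-window aggregation: each event falls into exactly one
# fixed, non-overlapping window determined by its timestamp.
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=10):
    """Count (timestamp, key) events per (window_start, key) bucket."""
    counts = defaultdict(int)
    for ts, key in events:
        # Floor the timestamp to the start of its window.
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, key)] += 1
    return dict(counts)

# Hypothetical clickstream events as (epoch_seconds, event_type) pairs.
events = [(1, "click"), (4, "click"), (12, "click"), (15, "view")]
result = tumbling_window_counts(events)
```

The same logic expressed in Spark Structured Streaming would be a `groupBy(window(...), key).count()`; the windowing arithmetic is identical.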

Posted 3 days ago


5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


We deliver the world’s most complex projects Work as part of a collaborative and inclusive team Enjoy a varied & challenging role Building on our past. Ready for the future Worley is a global professional services company of energy, chemicals and resources experts, headquartered in Australia. Right now, we’re bridging two worlds as we accelerate to more sustainable energy sources, while helping our customers provide the energy, chemicals and resources that society needs now. We partner with our customers to deliver projects and create value over the life of their portfolio of assets. We solve complex problems by finding integrated data-centric solutions from the first stages of consulting and engineering to installation and commissioning, to the last stages of decommissioning and remediation. Join us and help drive innovation and sustainability in our projects. The Role Develop and implement data pipelines for ingesting and collecting data from various sources into a centralized data platform. Develop and maintain ETL jobs using AWS Glue services to process and transform data at scale. Optimize and troubleshoot AWS Glue jobs for performance and reliability. Utilize Python and PySpark to efficiently handle large volumes of data during the ingestion process. Collaborate with data architects to design and implement data models that support business requirements. Create and maintain ETL processes using Airflow, Python and PySpark to move and transform data between different systems. Implement monitoring solutions to track data pipeline performance and proactively identify and address issues. Manage and optimize databases, both SQL and NoSQL, to support data storage and retrieval needs. Familiarity with Infrastructure as Code (IaC) tools like Terraform, AWS CDK and others. Proficiency in event-driven integrations, batch-based and API-led data integrations. Proficiency in CI/CD pipelines such as Azure DevOps, AWS pipelines or GitHub Actions.
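One building block behind the reliable API-led ingestion described in this role is wrapping each source call in retry with exponential backoff, so transient network or throttling errors don't fail the whole pipeline run. A minimal sketch in plain Python; `flaky_fetch` is a hypothetical stand-in for what would really be an HTTP client or AWS SDK call.

```python
# Retry-with-backoff sketch for an ingestion step. The helper is generic;
# the "source" below deliberately fails twice to show the retry behaviour.
import time

def with_retries(fn, attempts=3, base_delay=0.01):
    """Call fn, retrying on any exception with exponential backoff."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the error to the scheduler
            time.sleep(base_delay * (2 ** attempt))

calls = {"n": 0}

def flaky_fetch():
    """Hypothetical source call that succeeds only on the third attempt."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient source error")
    return [{"id": 1}]

rows = with_retries(flaky_fetch)
```

Orchestrators like Airflow and managed services like AWS Glue provide their own task-level retries; an in-code wrapper like this is for finer-grained calls (one page of an API, one partition) inside a task.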
About You To be considered for this role it is envisaged you will possess the following attributes: Technical and Industry Experience: Independent Integration Developer with 5+ years of experience in developing and delivering integration projects in an agile or waterfall-based project environment. Proficiency in Python, PySpark and SQL programming languages for data manipulation and pipeline development. Hands-on experience with AWS Glue, Airflow, DynamoDB, Redshift, S3 buckets, Event-Grid, and other AWS services. Experience implementing CI/CD pipelines, including data testing practices. Proficient in Swagger, JSON, XML, SOAP and REST-based web service development. Behaviors Required: Driven by our values and purpose in everything we do. Visible, active, hands-on approach to help teams be successful. Strong proactive planning ability. Optimistic, energetic problem solver with the ability to see long-term business outcomes. Collaborative; ability to listen and compromise to make progress. Stronger-together mindset, with a focus on innovation & creation of tangible / realized value. Challenge the status quo. Education – Qualifications, Accreditation, Training: Degree in Computer Science and/or related fields. AWS Data Engineering certifications desirable. Moving forward together We’re committed to building a diverse, inclusive and respectful workplace where everyone feels they belong, can bring themselves, and are heard. We provide equal employment opportunities to all qualified applicants and employees without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by law. We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation.
And we're not just talking about it; we're doing it. We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low carbon energy infrastructure and technology. Whatever your ambition, there’s a path for you here. And there’s no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change. Company: Worley | Primary Location: IND-MM-Mumbai | Job: Digital Solutions | Schedule: Full-time | Employment Type: Employee | Job Level: Experienced | Job Posting: Jun 4, 2025 | Unposting Date: Jul 4, 2025 | Reporting Manager Title: Director

Posted 3 days ago


5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Snowflake Data Warehouse, PySpark, Core Banking Good to have skills : AWS BigData Minimum 5 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge in data engineering. - Continuously evaluate and improve data processes to enhance efficiency and effectiveness. Professional & Technical Skills: - Must-Have Skills: Proficiency in Snowflake Data Warehouse, Core Banking, PySpark. - Good-to-Have Skills: Experience with AWS BigData. - Strong understanding of data modeling and database design principles. - Experience with data integration tools and ETL processes. - Familiarity with data governance and data quality frameworks. Additional Information: - The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse. - This position is based in Pune.
- A 15-year full-time education is required.
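The "ensure data quality" responsibility that recurs in these ETL roles usually starts with simple row-level gates run before data is migrated downstream. Below is an illustrative, framework-free sketch of such a gate; the column names (`customer_id`, `amount`) and rules are invented examples, not the API of any particular data-quality framework.

```python
# Row-level data-quality gate: split incoming rows into passing and failing
# sets using completeness and range rules, so bad rows can be quarantined
# instead of silently loaded.

def check_quality(rows, required=("customer_id", "amount")):
    """Return (good, bad) row lists based on simple completeness/range rules."""
    good, bad = [], []
    for row in rows:
        complete = all(row.get(col) is not None for col in required)
        if complete and row["amount"] >= 0:
            good.append(row)
        else:
            bad.append(row)
    return good, bad

rows = [
    {"customer_id": "c1", "amount": 120.0},
    {"customer_id": None, "amount": 50.0},   # fails completeness
    {"customer_id": "c2", "amount": -5.0},   # fails range check
]
good, bad = check_quality(rows)
```

In production pipelines the failing set would typically be written to a quarantine table and surfaced via pipeline monitoring rather than discarded.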

Posted 3 days ago


3.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : No Additional Skills Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly within the existing infrastructure. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application design and functionality. Roles & Responsibilities: - Expected to perform independently and become an SME. - Active participation/contribution in team discussions is required. - Contribute to providing solutions to work-related problems. - Assist in the documentation of application processes and workflows. - Engage in code reviews to ensure quality and adherence to best practices. Professional & Technical Skills: - Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform. - Strong understanding of data integration techniques and ETL processes. - Experience with cloud computing platforms and services. - Familiarity with programming languages such as Python or Scala. - Knowledge of data modeling and database design principles. Additional Information: - The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform. - This position is based at our Pune office. - A 15-year full-time education is required.

Posted 3 days ago


0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Role Title: SAP Analytics Service Analyst Location: Chennai We are one purpose-led global organisation: the enablers and innovators, ensuring that we can fulfil our mission to push the boundaries of science and discover and develop life-changing medicines. We take pride in working close to the cause, opening the locks to save lives, ultimately making a massive difference to the outside world. AstraZeneca (AZ) is in a period of strong growth and our employees have a united purpose to make a difference to patients around the world who need both our medicines and the ongoing developments from our science. In this journey AZ must continue to work across borders and with partners and new colleagues in a fast and seamless way. The ambition, size and complexity of the organisation, coupled with the opportunities afforded by new technology, have led the Board to approve a large-scale transformation programme – Axial. The Axial Programme will be powered by S/4HANA, a new ERP (Enterprise Resource Planning) system which will be implemented right across the organisation and will provide our business with standardised processes, enhanced financial management, common data and real-time reporting, transforming the way we work through our entire supply chain - from bench to patient. The new system will be used by more than 20,000 employees daily, is foundational to all AZ entities and is central to most core business processes. This is a once-in-a-generation programme for AstraZeneca and will shape our ways of working globally for many years to come. The Axial programme needs the best talent to work in it. Whether it’s the technical skills, business understanding or change leadership, we want to ensure we have the strongest team deployed throughout. We are aiming to deliver a world-class change programme that leaves all employees with a fuller understanding of their role in the end-to-end nature of our global company.
This programme will provide AZ with a competitive edge, to the benefit of our employees, customers and patients. What You’ll Do We are looking for a service analyst to work with the Analytics Service Lead on AstraZeneca’s new SAP analytics and AI capabilities. We want to deliver amazing customer experiences and world-class value; this cannot be achieved without great service management. As an SAP Analytics Service Analyst, you will contribute to: ITIL-based process design; service introduction; definitions of service metrics; change management; and managing operations on ServiceNow. You will also lead the ongoing improvement of the service once we go live. Essential For The Role Proven experience of service management. Good hands-on experience with ServiceNow. Good hands-on experience with JIRA. Good understanding of ETL processes. Good with data reconciliation. Desirable for the role Service management on a 24x7 business-critical platform for a global company. ITIL qualification. Provide technical support to end-users, ensuring effective utilization of SAP analytics tools. Conduct workshops and training sessions to improve user competency in data analysis tools. Collaborate with IT and business teams to integrate various data sources and systems with SAP analytics. Assist in the development and implementation of new analytics solutions and enhancements. Ensure compliance with data governance policies and maintain standards for data accuracy and security. Implement best practices for data management and analytics reporting. Team leadership and building teams. Managing outsourced service providers. Familiarity with regulatory frameworks (GxP, GDPR, SOX). Knowledge of SAP ecosystems, including SAP S/4HANA, SAP Analytics Cloud, SAP Datasphere and Collibra. Process digitisation and automation. Strong analytical and problem-solving skills. Excellent communication and interpersonal skills. Experience in pharmaceutical, healthcare, or manufacturing industries.
Knowledge of Agile or hybrid project delivery methodologies. Why AstraZeneca? At AstraZeneca we’re dedicated to being a Great Place to Work, where you are empowered to push the boundaries of science and unleash your entrepreneurial spirit. There’s no better place to make a difference to medicine, patients and society. An inclusive culture that champions diversity and collaboration, and is always committed to lifelong learning, growth and development. We’re on an exciting journey to pioneer the future of healthcare. So, what’s next? Are you already imagining yourself joining our team? Good, because we can’t wait to hear from you. Are you ready to bring new ideas and fresh thinking to the table? Brilliant! We have one seat available and hope it's yours. If you’re curious to know more then we welcome your application no later than Where can I find out more? Our Social Media: Follow AstraZeneca on LinkedIn https://www.linkedin.com/company/1603/ Follow AstraZeneca on Facebook https://www.facebook.com/astrazenecacareers/ Follow AstraZeneca on Instagram https://www.instagram.com/astrazeneca_careers/?hl=en

Posted 3 days ago


6.0 years

0 Lacs

Bengaluru East, Karnataka, India

On-site


Job Title: Senior Analyst Career Level: D Introduction to role: Are you ready to transform raw data into actionable insights that drive strategic initiatives? As a Data Reporting and Analytics Developer at Alexion, you'll be at the forefront of designing, developing, and maintaining data reporting solutions and analytics platforms. Your expertise will empower data-driven decision-making across the organization. If you have a strong background in data analysis, proficiency in data visualization tools, experience with database management systems, and excellent communication skills, this role is perfect for you! Accountabilities: Support the Alexion team with field force reporting by designing, developing, validating, and maintaining Qlik Sense dashboards and supporting model tiers for various business units and indications. Understand business objectives, data sources, and key performance indicators (KPIs) to design effective solutions. Design and implement data models in Qlik Sense/Power BI, including ETL processes. Write and optimize SQL code to transform raw data into actionable insights; transform source data into dimensions/factors for dashboards. Integrate data from multiple sources, ensuring accuracy, consistency, and optimal performance. Develop interactive dashboards, reports, and visualizations using Qlik Sense/Power BI. Identify and address performance bottlenecks in Qlik applications; optimize data models, load scripts, and front-end visualizations for fast user experiences. Conduct thorough testing of Qlik/PBI applications to validate data accuracy, functionality, and usability. Collaborate with QA testers and business users to resolve issues promptly. Design intuitive user interfaces for data exploration, analysis, and insight generation. Work closely with cross-functional teams to align Qlik development efforts with organizational goals. Communicate project status, challenges, and recommendations to stakeholders clearly.
Instill a culture of continuous improvement, testing, and deployment of new capabilities. Essential Skills/Experience: Advanced understanding of/experience with SQL, Snowflake, and Veeva CRM. Ability to create new rules and adjust existing rules. Develop and maintain comprehensive technical documentation for data processes, SQL queries, and workflows. Document changes and updates to data models and reporting structures. Expertise in Power BI or Qlik scripting language plus data modeling concepts, including related skills in: data warehousing; data architecture; data visualization (inclusive of Vizlib extensions); section access (security). Recent project experience with Qlik/PBI plus experience with other BI tools. Experience in documenting data processes, data lineage, and reporting frameworks. Desirable Skills/Experience: Background in computer science, information systems, or a related field. 6-8 years of experience in developing reporting and visualization applications. Experience in web development (JavaScript and CSS). Excellent analytical and problem-solving skills with keen attention to detail. Ability to work independently and collaboratively in a dynamic environment. Strong communication and interpersonal skills. When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world. At AstraZeneca's Alexion division, you'll find an energizing culture where connections are built to explore new ideas that profoundly impact patients' lives. Diversity is valued here, fostering an inclusive environment where life-changing ideas can emerge from anywhere.
We celebrate achievements and reward each other while maintaining kindness as a core value alongside our ambition to succeed for those in need. Our commitment extends beyond our work as we take pride in giving back to the communities we serve. Ready to make a difference? Apply now to join our team!
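The SQL-transformation work this role describes (rolling raw source rows up into dashboard-ready structures) can be illustrated with a toy example. SQLite stands in here for a warehouse like Snowflake, and all table and column names (`raw_sales`, `fact_sales`, `rep`, `product`) are invented for illustration.

```python
# Toy warehouse transform: aggregate raw transaction rows into a summary
# fact table a reporting tool (Qlik Sense, Power BI) would read.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_sales (rep TEXT, product TEXT, amount REAL);
    INSERT INTO raw_sales VALUES
        ('alice', 'drugA', 100.0),
        ('alice', 'drugA', 50.0),
        ('bob',   'drugB', 75.0);
    -- Roll raw rows up to one fact row per rep/product for the dashboard.
    CREATE TABLE fact_sales AS
        SELECT rep, product, SUM(amount) AS total_amount, COUNT(*) AS n_orders
        FROM raw_sales
        GROUP BY rep, product;
""")
facts = conn.execute(
    "SELECT rep, product, total_amount, n_orders FROM fact_sales ORDER BY rep"
).fetchall()
```

Pre-aggregating into a fact table like this is one common way to keep dashboard load scripts fast: the BI tool reads a few summary rows instead of scanning every raw transaction.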

Posted 3 days ago


8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Overview Working at Atlassian Atlassians can choose where they work – whether in an office, from home, or a combination of the two. That way, Atlassians have more control over supporting their family, personal goals, and other priorities. We can hire people in any country where we have a legal entity. Interviews and onboarding are conducted virtually, a part of being a distributed-first company. Responsibilities Partner with Data Science, Product Manager, Analytics, and Business teams to review and gather the data/reporting/analytics requirements and build trusted and scalable data models, data extraction processes, and data applications to help answer complex questions. Design and implement data pipelines to ETL data from multiple sources into a central data warehouse. Design and implement real-time data processing pipelines using Apache Spark Streaming. Improve data quality by leveraging internal tools/frameworks to automatically detect and mitigate data quality issues. Develop and implement data governance procedures to ensure data security, privacy, and compliance. Implement new technologies to improve data processing and analysis. Coach and mentor junior data engineers to enhance their skills and foster a collaborative team environment. Qualifications A BE in Computer Science or equivalent with 8+ years of professional experience as a Data Engineer or in a similar role Experience building scalable data pipelines in Spark using the Airflow scheduler/executor framework or similar scheduling tools. Experience with Databricks and its APIs. Experience with modern databases (Redshift, DynamoDB, MongoDB, Postgres or similar) and data lakes. Proficient in one or more programming languages such as Python/Scala, and rock-solid SQL skills.
Champion automated builds and deployments using CICD tools like Bitbucket, Git Experience working with large-scale, high-performance data processing systems (batch and streaming) Our perks & benefits Atlassian offers a variety of perks and benefits to support you, your family and to help you engage with your local community. Our offerings include health coverage, paid volunteer days, wellness resources, and so much more. Visit go.atlassian.com/perksandbenefits to learn more. About Atlassian At Atlassian, we're motivated by a common goal: to unleash the potential of every team. Our software products help teams all over the planet and our solutions are designed for all types of work. Team collaboration through our tools makes what may be impossible alone, possible together. We believe that the unique contributions of all Atlassians create our success. To ensure that our products and culture continue to incorporate everyone's perspectives and experience, we never discriminate based on race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status. All your information will be kept confidential according to EEO guidelines. To provide you the best experience, we can support with accommodations or adjustments at any stage of the recruitment process. Simply inform our Recruitment team during your conversation with them. To learn more about our culture and hiring process, visit go.atlassian.com/crh . Show more Show less
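The role above calls for automatically detecting and mitigating data quality issues. As a minimal illustration of that idea (not Atlassian's internal framework; the rules and the row fields `id` and `amount` are invented for the example), a pipeline step can split incoming rows into clean rows and a per-rule issue count:

```python
# Minimal data-quality gate: route bad rows out of the pipeline and
# count failures per rule, so the job can alert or quarantine them.
def quality_report(rows):
    """Apply two simple checks: missing id, negative amount."""
    issues = {"missing_id": 0, "negative_amount": 0}
    clean = []
    for row in rows:
        if row.get("id") is None:
            issues["missing_id"] += 1
        elif row.get("amount", 0) < 0:
            issues["negative_amount"] += 1
        else:
            clean.append(row)
    return clean, issues

rows = [
    {"id": 1, "amount": 5.0},
    {"id": None, "amount": 2.0},   # fails: missing id
    {"id": 3, "amount": -1.0},     # fails: negative amount
]
clean, issues = quality_report(rows)
print(len(clean), issues)  # 1 {'missing_id': 1, 'negative_amount': 1}
```

In a production pipeline the same pattern is usually expressed with a framework (e.g. expectations evaluated per batch), but the shape is the same: validate, quarantine, report.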

Posted 3 days ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Responsibilities:
- Design, develop, and maintain both front-end and back-end components of web applications
- Write clean, efficient, and maintainable code using languages and frameworks such as JavaScript, HTML5, jQuery, React, Python, or Node.js
- Build front-end applications with appealing visual design
- Develop and manage databases, ensuring data integrity and security
- Create and maintain RESTful and GraphQL APIs
- Implement JWT and OAuth for secure authentication and authorization
- Implement automated testing frameworks and conduct thorough testing
- Manage the deployment process, including CI/CD pipelines
- Work with development teams and product managers to create efficient software solutions
- Lead and mentor junior developers, providing guidance and support
- Oversee the entire software development lifecycle from conception to deployment

Good to have:
- Bachelor's degree or higher in Computer Science or a related field
- 10+ years of experience as a Full Stack Developer or in a similar role
- Experience developing web and mobile applications
- Experience with version control systems like Git
- Proficiency in multiple front-end languages and libraries (e.g. HTML/CSS, JavaScript, XML, jQuery, React, Angular, ASP.NET)
- Proficiency in multiple back-end languages (e.g. C#, Python, .NET Core) and frameworks (e.g. Node.js, Django)
- Knowledge of databases (e.g. SQL, MySQL, MongoDB), web servers (e.g. Apache), and UI/UX design
- Experience with cloud platforms such as AWS, Azure, or Google Cloud
- Familiarity with containerization (Docker) and orchestration (Kubernetes)
- Understanding of software development principles and best practices
- Ability to conduct regular code reviews to ensure code quality and adherence to standards
- Ability to work efficiently in a collaborative team environment
- Excellent problem-solving and analytical skills
- Experience with other JavaScript frameworks and libraries (e.g. Angular, Vue.js)
- Knowledge of DevOps practices and tools like Azure CI/CD, Jenkins, or GitLab CI
- Familiarity with data warehousing and ETL processes
- Experience with microservices architecture

Posted 3 days ago

Apply


6.0 - 11.0 years

25 - 40 Lacs

Gurugram, Bengaluru

Hybrid

Naukri logo

Salary: 25 to 40 LPA
Experience: 7 to 11 years
Location: Bangalore/Gurgaon
Notice period: Immediate joiners only
Key Skills: SQL, Advanced SQL, BI tools, ETL

Roles and Responsibilities
- Extract, manipulate, and analyze large datasets from various sources such as Hive, SQL databases, and BI tools.
- Develop and maintain Tableau dashboards that provide insights into banking performance, market trends, and customer behavior.
- Collaborate with cross-functional teams to identify key performance indicators (KPIs) and develop data visualizations that drive business decisions.

Desired Candidate Profile
- 6-10 years of experience in Data Analytics or a related field, with expertise in Banking Analytics, Business Intelligence, Campaign Analytics, Marketing Analytics, etc.
- Strong proficiency in data visualization tools like Tableau; Advanced SQL knowledge preferred.
- Experience working with big data technologies such as the Hadoop ecosystem (Hive) and Spark; familiarity with Python required.

Posted 3 days ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Naukri logo

Salary: 20 to 35 LPA
Experience: 3 to 8 years
Location: Pune/Bangalore/Gurgaon (Hybrid)
Notice period: Immediate joiners only
Key Skills: SQL, Advanced SQL, BI tools

Roles and Responsibilities
- Extract, manipulate, and analyze large datasets from various sources such as Hive, SQL databases, and BI tools.
- Develop and maintain Tableau dashboards that provide insights into banking performance, market trends, and customer behavior.
- Collaborate with cross-functional teams to identify key performance indicators (KPIs) and develop data visualizations that drive business decisions.

Desired Candidate Profile
- 3-8 years of experience in Data Analytics or a related field, with expertise in Banking Analytics, Business Intelligence, Campaign Analytics, Marketing Analytics, etc.
- Strong proficiency in data visualization tools like Tableau; Advanced SQL knowledge preferred.
- Experience working with big data technologies such as the Hadoop ecosystem (Hive) and Spark; familiarity with Python required.

Posted 3 days ago

Apply

3.0 - 8.0 years

15 - 30 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid

Naukri logo

Salary: 20 to 35 LPA
Experience: 3 to 8 years
Location: Gurgaon (Hybrid)
Notice period: Immediate joiners only
Key Skills: SQL, Advanced SQL, BI tools

Roles and Responsibilities
- Extract, manipulate, and analyze large datasets from various sources such as Hive, SQL databases, and BI tools.
- Develop and maintain Tableau dashboards that provide insights into banking performance, market trends, and customer behavior.
- Collaborate with cross-functional teams to identify key performance indicators (KPIs) and develop data visualizations that drive business decisions.

Desired Candidate Profile
- 3-8 years of experience in Data Analytics or a related field, with expertise in Banking Analytics, Business Intelligence, Campaign Analytics, Marketing Analytics, etc.
- Strong proficiency in data visualization tools like Tableau; Advanced SQL knowledge preferred.
- Experience working with big data technologies such as the Hadoop ecosystem (Hive) and Spark; familiarity with Python required.

Posted 3 days ago

Apply

5.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Linkedin logo

Company Description
BEYOND SOFTWARES AND CONSULTANCY SERVICES Pvt. Ltd. (BSC Services) is committed to delivering innovative solutions to meet clients' evolving needs, particularly in the telecommunication industry. We provide a variety of software solutions for billing and customer management, network optimization, and data analytics. Our skilled team of software developers and telecom specialists collaborates closely with clients to understand their specific requirements and deliver high-quality, secure software solutions. We strive to build long-term relationships based on trust, transparency, and open communication, ensuring our clients stay competitive and grow in a dynamic market.

Role Description
We are looking for a Data Engineer with expertise in dbt and Airflow for a full-time remote position. The Data Engineer will be responsible for designing, developing, testing, and maintaining data pipelines and ETL processes using Amazon Redshift, dbt, and Airflow. Day-to-day tasks include data modeling, data warehousing, and implementing data analytics solutions. The role involves collaborating with cross-functional teams to ensure data integrity and optimize data workflows.

Must-Have Skills:
- 5 to 10 years of IT experience in data transformation on an Amazon Redshift data warehouse using Apache Airflow, dbt (Data Build Tool), and Cosmos.
- Hands-on experience with complex data warehouse implementations.
- Expert-level Advanced SQL.
- Minimum 5 years of hands-on experience with the Amazon Redshift data warehouse.
- Experience using dbt for data transformation.
- Experience developing, scheduling, and monitoring workflow orchestration with Apache Airflow, including DAG construction.
- Experience with the Astro and Cosmos libraries.
- DevOps experience: Bitbucket, GitHub, or GitLab.
- Minimum 5 years of experience on data transformation projects.
- Development of data ingestion pipelines and robust ETL frameworks.
- Strong hands-on experience analyzing data on large datasets.
- Extensive experience in dimensional data modeling, including complex entity relationships and historical data entities.
- Implementation of data cleansing and data quality features in ETL pipelines.
- Implementation of data streaming solutions from different sources for data migration and transformation.
- Extensive data engineering experience using Python.
- Experience in SQL and performance tuning.
- Hands-on experience parsing API responses (REST/XML/JSON).
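The "construction of the DAG" skill above is about declaring task dependencies so the scheduler runs steps in the right order. A hedged, stdlib-only sketch of that idea (using `graphlib` rather than Airflow's own API, with task names invented for the example):

```python
# Dependency-ordered ETL tasks, sketched with Python's stdlib topological
# sorter. In Airflow the same shape is declared with a DAG object and
# operators; the ordering concept is identical.
from graphlib import TopologicalSorter

# Each key lists its prerequisite tasks:
# extract -> load staging -> dbt transform -> dbt tests.
deps = {
    "load_staging": {"extract_redshift"},
    "dbt_transform": {"load_staging"},
    "dbt_test": {"dbt_transform"},
}

order = list(TopologicalSorter(deps).static_order())
print(order)  # extract_redshift runs first, dbt_test runs last
```

Because this graph is a simple chain the order is fully determined; with branching pipelines the sorter only guarantees that every task appears after its prerequisites.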

Posted 3 days ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Job Description

Position Overview
We are seeking a highly experienced and motivated Full Stack Developer to join our core development team. You will play a lead role in the architecture and development of our custom ERP system as well as our Data Warehouse/Lake. With a focus on modular design, integration, and process optimization, the ideal candidate possesses deep backend experience, has strong business-logic acumen, and thrives in an Agile environment.

Key Responsibilities
- Lead the development of backend modules for ERP systems using Python.
- Design and optimize workflows for inventory, CRM, POS, finance, and reporting modules.
- Integrate internal and external services via APIs (e.g., marketplaces, payment gateways).
- Write well-documented, testable, and scalable code using Agile best practices.
- Conduct peer code reviews and mentor junior developers.
- Collaborate with Product Owners and Scrum Masters in sprint planning and backlog grooming.
- Contribute to technical discussions and propose solutions for scalability and maintainability.
- Provide high-level and low-level technical design documents, database schema plans, and integration diagrams.
- Lead infrastructure planning, setup, and deployment on AWS (EC2, RDS, S3, Auto Scaling, Load Balancers, etc.).
- Ensure systems are scalable, highly available, and resilient to failure.
- Define integration strategies across the internal ERP, data pipelines, third-party services, and analytics tools.

Requirements
- 6+ years of software development experience.
- Strong coding background in Python, including ERP and backend systems.
- Proven experience setting up AWS infrastructure for scalable applications.
- Proficiency with PostgreSQL or MySQL, Git, Docker, AWS, and Linux environments.
- Understanding of data pipelines, ETL flows, and warehouse models.
- Strong understanding of business processes and ERP logic.
- Experience with Docker, CI/CD, and infrastructure-as-code tools.
- Strong grasp of Agile/Scrum frameworks, ceremonies, and team collaboration.
- Good communication skills and the ability to work cross-functionally with both technical and non-technical stakeholders.

Budget: 8-12 Lacs P.A.

Qualifications: Min. University Degree

Posted 3 days ago

Apply

Exploring ETL Jobs in India

The ETL (Extract, Transform, Load) job market in India is thriving with numerous opportunities for job seekers. ETL professionals play a crucial role in managing and analyzing data effectively for organizations across various industries. If you are considering a career in ETL, this article will provide you with valuable insights into the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Chennai

These cities are known for their thriving tech industries and often have a high demand for ETL professionals.

Average Salary Range

The average salary range for ETL professionals in India varies based on experience levels. Entry-level positions typically start at around ₹3-5 lakhs per annum, while experienced professionals can earn upwards of ₹10-15 lakhs per annum.

Career Path

In the ETL field, a typical career path may include roles such as:

  1. Junior ETL Developer
  2. ETL Developer
  3. Senior ETL Developer
  4. ETL Tech Lead
  5. ETL Architect

As you gain experience and expertise, you can progress to higher-level roles within the ETL domain.

Related Skills

Alongside ETL, professionals in this field are often expected to have skills in:

  • SQL
  • Data Warehousing
  • Data Modeling
  • ETL Tools (e.g., Informatica, Talend)
  • Database Management Systems (e.g., Oracle, SQL Server)

Having a strong foundation in these related skills can enhance your capabilities as an ETL professional.
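To make the SQL and data-warehousing skills above concrete, here is a tiny, hedged example of a classic warehouse load: aggregating a staging table into a fact table. It uses Python's built-in sqlite3 so it runs anywhere; the table names (`stg_sales`, `fact_daily_sales`) are invented for illustration.

```python
# Build a small fact table from staging data: the "T" and "L" of ETL
# expressed as a single aggregating SQL statement.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_sales (day TEXT, region TEXT, amount REAL)")
conn.executemany("INSERT INTO stg_sales VALUES (?, ?, ?)", [
    ("2024-01-01", "south", 100.0),
    ("2024-01-01", "south", 50.0),
    ("2024-01-01", "north", 70.0),
])

# Transform + load: one row per (day, region) with the summed amount.
conn.execute("""
    CREATE TABLE fact_daily_sales AS
    SELECT day, region, SUM(amount) AS total
    FROM stg_sales
    GROUP BY day, region
""")

for row in conn.execute("SELECT region, total FROM fact_daily_sales ORDER BY region"):
    print(row)  # ('north', 70.0) then ('south', 150.0)
```

Real warehouses (Redshift, Snowflake, BigQuery) scale this pattern up, but interviewers often probe exactly this kind of aggregation-into-a-fact-table reasoning.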

Interview Questions

Here are 25 interview questions that you may encounter in ETL job interviews:

  • What is ETL and why is it important? (basic)
  • Explain the difference between ETL and ELT processes. (medium)
  • How do you handle incremental loads in ETL processes? (medium)
  • What is a surrogate key in the context of ETL? (basic)
  • Can you explain the concept of data profiling in ETL? (medium)
  • How do you handle data quality issues in ETL processes? (medium)
  • What are some common ETL tools you have worked with? (basic)
  • Explain the difference between a full load and an incremental load. (basic)
  • How do you optimize ETL processes for performance? (medium)
  • Can you describe a challenging ETL project you worked on and how you overcame obstacles? (advanced)
  • What is the significance of data cleansing in ETL? (basic)
  • How do you ensure data security and compliance in ETL processes? (medium)
  • Have you worked with real-time data integration in ETL? If so, how did you approach it? (advanced)
  • What are the key components of an ETL architecture? (basic)
  • How do you handle data transformation requirements in ETL processes? (medium)
  • What are some best practices for ETL development? (medium)
  • Can you explain the concept of change data capture in ETL? (medium)
  • How do you troubleshoot ETL job failures? (medium)
  • What role does metadata play in ETL processes? (basic)
  • How do you handle complex transformations in ETL processes? (medium)
  • What is the importance of data lineage in ETL? (basic)
  • Have you worked with parallel processing in ETL? If so, explain your experience. (advanced)
  • How do you ensure data consistency across different ETL jobs? (medium)
  • Can you explain the concept of slowly changing dimensions in ETL? (medium)
  • How do you document ETL processes for knowledge sharing and future reference? (basic)
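Several of the questions above hinge on incremental versus full loads. A minimal sketch of a watermark-based incremental load, using Python's stdlib sqlite3 (table and column names are invented for the example), shows the core idea: only copy source rows newer than the last successfully loaded timestamp.

```python
# Watermark-based incremental load: each run copies only rows whose
# updated_at is later than the previous run's high-water mark.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (id INTEGER, amount REAL, updated_at TEXT)")
cur.execute("CREATE TABLE dw_orders  (id INTEGER, amount REAL, updated_at TEXT)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?, ?)",
                [(1, 10.0, "2024-01-01"), (2, 20.0, "2024-01-02")])

def incremental_load(cur, watermark):
    """Copy source rows newer than the watermark; return the new watermark."""
    cur.execute(
        "INSERT INTO dw_orders SELECT * FROM src_orders WHERE updated_at > ?",
        (watermark,),
    )
    cur.execute("SELECT MAX(updated_at) FROM dw_orders")
    return cur.fetchone()[0]

wm = incremental_load(cur, "1970-01-01")   # first run: loads everything
cur.execute("INSERT INTO src_orders VALUES (3, 30.0, '2024-01-05')")
wm = incremental_load(cur, wm)             # second run: loads only row 3

cur.execute("SELECT COUNT(*) FROM dw_orders")
print(cur.fetchone()[0], wm)  # 3 2024-01-05
```

Production pipelines add the details interviewers probe for: persisting the watermark transactionally, handling late-arriving and updated rows (often via MERGE/upsert), and using change data capture when the source cannot be queried by timestamp.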

Closing Remarks

As you explore ETL jobs in India, remember to showcase your skills and expertise confidently during interviews. With the right preparation and a solid understanding of ETL concepts, you can embark on a rewarding career in this dynamic field. Good luck with your job search!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies