
264 Aggregations Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Hands-on data automation engineer with strong Python or Java coding skills and solid SQL expertise, who can work with large datasets, understand stored procedures, and independently write data-driven automation logic. Develop and execute test cases with a focus on Fixed Income trading workflows. The requirement goes beyond automation tools and aligns better with a junior developer or data automation role.

Desired Skills and Experience:
- Strong programming experience in Python (preferred) or Java
- Strong experience working with Python and its libraries, such as Pandas and NumPy
- Hands-on experience with SQL, including writing and debugging complex queries (joins, aggregations, filtering, etc.) and understanding stored procedures and using them in automation
- Experience working with data structures, large tables, and datasets
- Comfort with data manipulation, validation, and building comparison scripts

Nice to have:
- Familiarity with PyCharm, VS Code, or IntelliJ, and an understanding of how automation integrates into CI/CD pipelines
- Prior exposure to financial data or post-trade systems (a bonus)
- Excellent communication skills, both written and verbal
- Experience working with test management tools (e.g., X-Ray/JIRA)
- Extremely strong organizational and analytical skills with strong attention to detail
- Strong track record of excellent results delivered to internal and external clients
- Able to work independently without close supervision, and collaboratively as part of cross-team efforts
- Experience delivering projects within an agile environment

Key Responsibilities:
- Write custom data validation scripts based on provided regression test cases
- Read, understand, and translate stored procedure logic into test automation
- Compare datasets across environments and generate diffs
- Collaborate with team members and follow structured automation practices
- Contribute to building and maintaining a central automation script repository
- Establish and implement comprehensive QA strategies and test plans from scratch
- Develop and execute test cases with a focus on Fixed Income trading workflows
- Drive the creation of regression test suites for critical back-office applications
- Collaborate with development, business analysts, and project managers to ensure quality throughout the SDLC
- Provide clear and concise reporting on QA progress and metrics to management
- Bring strong subject matter expertise in the Financial Services industry, particularly fixed income trading products and workflows
- Ensure effective, efficient, and continuous communication (written and verbal) with global stakeholders
- Independently troubleshoot difficult and complex issues in different environments
- Take responsibility for end-to-end delivery of projects, coordination between the client and internal offshore teams, and managing client queries
- Demonstrate high attention to detail, work in a dynamic environment while maintaining high quality standards, and bring a natural aptitude for developing good internal working relationships and a flexible work ethic
- Be responsible for quality checks and adhere to the agreed Service Level Agreement (SLA) / Turnaround Time (TAT)
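The "compare datasets across environments and generate diffs" duty above can be sketched with a short Pandas comparison script. This is a minimal illustration, not the employer's actual tooling; the table and column names are hypothetical.

```python
import pandas as pd

# Hypothetical snapshots of the same trades table from two environments (UAT vs PROD).
uat = pd.DataFrame({"trade_id": [1, 2, 3], "notional": [100.0, 250.0, 75.0]})
prod = pd.DataFrame({"trade_id": [1, 2, 4], "notional": [100.0, 260.0, 50.0]})

# A full outer merge with an indicator column flags rows missing from either
# side, and lets us diff values for rows present in both.
diff = uat.merge(prod, on="trade_id", how="outer",
                 suffixes=("_uat", "_prod"), indicator=True)
mismatches = diff[(diff["_merge"] != "both") |
                  (diff["notional_uat"] != diff["notional_prod"])]
print(mismatches[["trade_id", "notional_uat", "notional_prod", "_merge"]])
```

Here trade 2 surfaces as a value mismatch, while trades 3 and 4 surface as rows present in only one environment.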

Posted 1 day ago

Apply

3.0 years

0 Lacs

Andhra Pradesh, India

On-site

At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. In SAP technology at PwC, you will specialise in utilising and managing SAP software and solutions within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of SAP products and technologies.

Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include, but are not limited to:
- Apply a learning mindset and take ownership of your own development
- Appreciate the diverse perspectives, needs, and feelings of others
- Adopt habits to sustain high performance and develop your potential
- Actively listen, ask questions to check understanding, and clearly express ideas
- Seek, reflect on, act on, and give feedback
- Gather information from a range of sources to analyse facts and discern patterns
- Commit to understanding how the business works and building commercial awareness
- Learn and apply professional and technical standards (e.g., specific PwC tax and audit guidance), and uphold the Firm's code of conduct and independence requirements

SAP Native HANA Developer – Technical Skills
- Bachelor's or Master's degree in a relevant field (e.g., computer science, information systems, engineering)
- Minimum of 3 years of experience in HANA native development and configuration, including at least 1 year with SAP BTP Cloud Foundry and HANA Cloud
- Demonstrated experience working with various SAP (SAP ECC, SAP CRM, SAP S/4HANA) and non-SAP (Oracle, Salesforce, AWS) data sources
- Demonstrated expertise in designing and implementing solutions on the SAP BTP platform
- Solid understanding of BTP HANA Cloud and its service offerings
- Strong focus on building expertise in constructing calculation views within the HANA Cloud environment (Business Application Studio) and other supporting data artifacts
- Experience with HANA XS Advanced and HANA 2.0
- Ability to optimize queries and data models for performance in the SAP HANA development environment, and a sound understanding of indexing, partitioning, and other performance optimization techniques
- Proven experience applying SAP HANA Cloud development tools and technologies, including HDI containers, HANA OData services, HANA XSA, strong SQL scripting, SDI/SLT replication, Smart Data Access (SDA), and Cloud Foundry UPS services
- Experience with ETL processes and tools (SAP Data Services preferred)
- Ability to debug and optimize existing queries and data models for performance
- Hands-on experience using Git within Business Application Studio, and familiarity with GitHub features and repository management
- Familiarity with reporting tools and security-related concepts within the HANA development environment
- Understanding of the HANA Transport Management System, the HANA Transport Container, and CI/CD practices for object deployment
- Knowledge of monitoring and troubleshooting techniques for SAP HANA BW environments
- Familiarity with reporting tools such as SAC or Power BI for building dashboards and consuming data models is a plus

HANA CDS views (added advantage):
- Understanding of associations, aggregations, and annotations in CDS views
- Ability to design and implement data models using CDS
- Certification in SAP HANA or related areas is a plus
- Functional knowledge of SAP business processes (FI/CO, MM, SD, HR)
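The join and aggregation shape that a HANA calculation view or CDS view models can be shown with a generic SQL stand-in. This sketch uses sqlite3 rather than HANA, so it illustrates only the relational pattern, not HANA-specific syntax; all table and column names are invented.

```python
import sqlite3

# Generic SQL stand-in for the join + aggregation nodes of a calculation view
# (or a CDS view's association + aggregation); schema is illustrative only.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE sales_order (order_id INTEGER, customer_id INTEGER);
CREATE TABLE sales_item  (order_id INTEGER, amount REAL);
INSERT INTO sales_order VALUES (1, 10), (2, 10), (3, 20);
INSERT INTO sales_item  VALUES (1, 100.0), (1, 50.0), (2, 25.0), (3, 40.0);
""")
# The order -> items association realized as a join, aggregated per customer.
rows = con.execute("""
SELECT o.customer_id, SUM(i.amount) AS total_amount
FROM sales_order AS o
JOIN sales_item  AS i ON i.order_id = o.order_id
GROUP BY o.customer_id
ORDER BY o.customer_id
""").fetchall()
print(rows)  # [(10, 175.0), (20, 40.0)]
```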

Posted 1 day ago

Apply

0 years

4 - 7 Lacs

Gurgaon

On-site

Job Purpose
Hands-on data automation engineer with strong Python or Java coding skills and solid SQL expertise, who can work with large datasets, understand stored procedures, and independently write data-driven automation logic. Develop and execute test cases with a focus on Fixed Income trading workflows. The requirement goes beyond automation tools and aligns better with a junior developer or data automation role.

Desired Skills and Experience:
- Strong programming experience in Python (preferred) or Java
- Strong experience working with Python and its libraries, such as Pandas and NumPy
- Hands-on experience with SQL, including writing and debugging complex queries (joins, aggregations, filtering, etc.) and understanding stored procedures and using them in automation
- Experience working with data structures, large tables, and datasets
- Comfort with data manipulation, validation, and building comparison scripts

Nice to have:
- Familiarity with PyCharm, VS Code, or IntelliJ, and an understanding of how automation integrates into CI/CD pipelines
- Prior exposure to financial data or post-trade systems (a bonus)
- Excellent communication skills, both written and verbal
- Experience working with test management tools (e.g., X-Ray/JIRA)
- Extremely strong organizational and analytical skills with strong attention to detail
- Strong track record of excellent results delivered to internal and external clients
- Able to work independently without close supervision, and collaboratively as part of cross-team efforts
- Experience delivering projects within an agile environment

Key Responsibilities:
- Write custom data validation scripts based on provided regression test cases
- Read, understand, and translate stored procedure logic into test automation
- Compare datasets across environments and generate diffs
- Collaborate with team members and follow structured automation practices
- Contribute to building and maintaining a central automation script repository
- Establish and implement comprehensive QA strategies and test plans from scratch
- Develop and execute test cases with a focus on Fixed Income trading workflows
- Drive the creation of regression test suites for critical back-office applications
- Collaborate with development, business analysts, and project managers to ensure quality throughout the SDLC
- Provide clear and concise reporting on QA progress and metrics to management
- Bring strong subject matter expertise in the Financial Services industry, particularly fixed income trading products and workflows
- Ensure effective, efficient, and continuous communication (written and verbal) with global stakeholders
- Independently troubleshoot difficult and complex issues in different environments
- Take responsibility for end-to-end delivery of projects, coordination between the client and internal offshore teams, and managing client queries
- Demonstrate high attention to detail, work in a dynamic environment while maintaining high quality standards, and bring a natural aptitude for developing good internal working relationships and a flexible work ethic
- Be responsible for quality checks and adhere to the agreed Service Level Agreement (SLA) / Turnaround Time (TAT)
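The "translate stored procedure logic into test automation" responsibility above usually means re-implementing the procedure's rule independently and asserting parity with the SQL result. A minimal sketch, using sqlite3 and an invented netting rule as a stand-in for a real stored procedure:

```python
import sqlite3

# Hypothetical stand-in for a stored procedure that nets buy/sell quantities
# per book; the test re-implements the logic in Python and asserts parity.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE trades (book TEXT, side TEXT, qty INTEGER);
INSERT INTO trades VALUES ('GOVT', 'BUY', 100), ('GOVT', 'SELL', 40), ('CORP', 'BUY', 10);
""")
sql_result = dict(con.execute("""
SELECT book, SUM(CASE WHEN side = 'BUY' THEN qty ELSE -qty END)
FROM trades GROUP BY book
""").fetchall())

# Independent Python re-implementation of the same netting rule.
expected = {}
for book, side, qty in con.execute("SELECT book, side, qty FROM trades"):
    expected[book] = expected.get(book, 0) + (qty if side == "BUY" else -qty)

assert sql_result == expected
print(sql_result)  # {'GOVT': 60, 'CORP': 10}
```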

Posted 1 day ago

Apply

3.0 - 9.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Category: Testing/Quality Assurance
Main location: India, Karnataka, Bangalore
Position ID: J0725-1442
Employment Type: Full Time

Position Description:
Company Profile: Founded in 1976, CGI is among the largest independent IT and business consulting services firms in the world. With 94,000 consultants and professionals across the globe, CGI delivers an end-to-end portfolio of capabilities, from strategic IT and business consulting to systems integration, managed IT and business process services, and intellectual property solutions. CGI works with clients through a local relationship model complemented by a global delivery network that helps clients digitally transform their organizations and accelerate results. CGI's Fiscal 2024 reported revenue is CA$14.68 billion, and CGI shares are listed on the TSX (GIB.A) and the NYSE (GIB). Learn more at cgi.com.

Job Title: ETL Testing Engineer
Position: Senior Test Engineer
Experience: 3-9 years
Category: Quality Assurance/Software Testing
Shift: 1-10 pm (UK shift)
Main location: Chennai/Bangalore

We are looking for an experienced DataStage tester to join our team. The ideal candidate should be passionate about coding and testing scalable, high-performance applications.

Your future duties and responsibilities:
- Develop and execute ETL test cases to validate data extraction, transformation, and loading processes
- Write complex SQL queries to verify data integrity, consistency, and correctness across source and target systems
- Automate ETL testing workflows using Python, PyTest, or other testing frameworks
- Perform data reconciliation, schema validation, and data quality checks
- Identify and report data anomalies, performance bottlenecks, and defects
- Work closely with data engineers, analysts, and business teams to understand data requirements
- Design and maintain test data sets for validation
- Implement CI/CD pipelines for automated ETL testing (Jenkins, GitLab CI, etc.)
- Document test results, defects, and validation reports

Required qualifications to be successful in this role:
- ETL testing: strong experience testing Informatica, Talend, SSIS, Databricks, or similar ETL tools
- SQL: advanced SQL skills (joins, aggregations, subqueries, stored procedures)
- Python: proficiency in Python for test automation (Pandas, PySpark, PyTest)
- Databases: hands-on experience with RDBMS (Oracle, SQL Server, PostgreSQL) and NoSQL (MongoDB, Cassandra)
- Big data testing (good to have): Hadoop, Hive, Spark, Kafka
- Testing tools: knowledge of Selenium, Airflow, Great Expectations, or similar frameworks
- Version control: Git, GitHub/GitLab
- CI/CD: Jenkins, Azure DevOps, or similar
- Soft skills: strong analytical and problem-solving skills; ability to work in Agile/Scrum environments; good communication skills for cross-functional collaboration

Preferred qualifications:
- Experience with cloud platforms (AWS, Azure)
- Knowledge of data warehousing concepts (star schema, snowflake schema)
- Certification in ETL testing, SQL, or Python is a plus

Skills: Data Warehousing, MS SQL Server, Python

What you can expect from us: Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect, and belonging. Here, you’ll reach your full potential because you are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value: you’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team, one of the largest IT and business consulting services firms in the world.
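The data reconciliation and quality-check duties in this listing are typically expressed as PyTest-style assertions comparing source and target. A minimal, self-contained sketch, with sqlite3 standing in for the real source and target systems and invented table names:

```python
import sqlite3

# Sketch of ETL reconciliation checks: row-count and aggregate ("checksum")
# parity between a source and target table. Names are illustrative; a real
# suite would connect to the actual systems and run under pytest.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE src (id INTEGER, amount REAL);
CREATE TABLE tgt (id INTEGER, amount REAL);
INSERT INTO src VALUES (1, 10.5), (2, 20.0), (3, 5.25);
INSERT INTO tgt VALUES (1, 10.5), (2, 20.0), (3, 5.25);
""")

def test_row_counts_match():
    src_n = con.execute("SELECT COUNT(*) FROM src").fetchone()[0]
    tgt_n = con.execute("SELECT COUNT(*) FROM tgt").fetchone()[0]
    assert src_n == tgt_n

def test_amount_checksum_matches():
    src_sum = con.execute("SELECT ROUND(SUM(amount), 2) FROM src").fetchone()[0]
    tgt_sum = con.execute("SELECT ROUND(SUM(amount), 2) FROM tgt").fetchone()[0]
    assert src_sum == tgt_sum

test_row_counts_match()
test_amount_checksum_matches()
print("reconciliation checks passed")
```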

Posted 2 days ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Purpose
Hands-on data automation engineer with strong Python or Java coding skills and solid SQL expertise, who can work with large datasets, understand stored procedures, and independently write data-driven automation logic. Develop and execute test cases with a focus on Fixed Income trading workflows. The requirement goes beyond automation tools and aligns better with a junior developer or data automation role.

Desired Skills and Experience:
- Strong programming experience in Python (preferred) or Java
- Strong experience working with Python and its libraries, such as Pandas and NumPy
- Hands-on experience with SQL, including writing and debugging complex queries (joins, aggregations, filtering, etc.) and understanding stored procedures and using them in automation
- Experience working with data structures, large tables, and datasets
- Comfort with data manipulation, validation, and building comparison scripts

Nice to have:
- Familiarity with PyCharm, VS Code, or IntelliJ, and an understanding of how automation integrates into CI/CD pipelines
- Prior exposure to financial data or post-trade systems (a bonus)
- Excellent communication skills, both written and verbal
- Experience working with test management tools (e.g., X-Ray/JIRA)
- Extremely strong organizational and analytical skills with strong attention to detail
- Strong track record of excellent results delivered to internal and external clients
- Able to work independently without close supervision, and collaboratively as part of cross-team efforts
- Experience delivering projects within an agile environment

Key Responsibilities:
- Write custom data validation scripts based on provided regression test cases
- Read, understand, and translate stored procedure logic into test automation
- Compare datasets across environments and generate diffs
- Collaborate with team members and follow structured automation practices
- Contribute to building and maintaining a central automation script repository
- Establish and implement comprehensive QA strategies and test plans from scratch
- Develop and execute test cases with a focus on Fixed Income trading workflows
- Drive the creation of regression test suites for critical back-office applications
- Collaborate with development, business analysts, and project managers to ensure quality throughout the SDLC
- Provide clear and concise reporting on QA progress and metrics to management
- Bring strong subject matter expertise in the Financial Services industry, particularly fixed income trading products and workflows
- Ensure effective, efficient, and continuous communication (written and verbal) with global stakeholders
- Independently troubleshoot difficult and complex issues in different environments
- Take responsibility for end-to-end delivery of projects, coordination between the client and internal offshore teams, and managing client queries
- Demonstrate high attention to detail, work in a dynamic environment while maintaining high quality standards, and bring a natural aptitude for developing good internal working relationships and a flexible work ethic
- Be responsible for quality checks and adhere to the agreed Service Level Agreement (SLA) / Turnaround Time (TAT)

Posted 2 days ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Role Overview: We are looking for an Associate/Sr. Associate with strong skills in SQL and Python to join our Last Mile team. You will be responsible for designing and building dashboards that provide actionable insights to support business decisions. The ideal candidate is analytical, detail-oriented, and passionate about turning raw data into clear visual stories.

Key Responsibilities:
- Develop, maintain, and optimize dashboards using business intelligence tools (e.g., Tableau, Power BI, Looker)
- Write efficient SQL queries to extract and analyze data from relational databases
- Use Python for data cleaning, transformation, and basic automation tasks
- Work with cross-functional teams to understand data requirements and deliver meaningful visualizations
- Ensure accuracy, consistency, and quality of data presented in reports
- Identify trends, anomalies, and opportunities from data, and communicate findings effectively

Required Skills:
- Proficiency in SQL (joins, aggregations, window functions)
- Experience in Python for data processing (Pandas, NumPy)
- Hands-on experience with dashboarding tools (e.g., Tableau, Power BI, Google Data Studio)
- Strong data visualization and storytelling skills
- Ability to interpret business needs and translate them into a technical solution
- Basic understanding of statistics or business metrics is a plus
- Bachelor’s degree in Computer Science, Statistics, Mathematics, Engineering, or a related field
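The SQL window functions and Pandas skills this role pairs together often meet in the same metric. A small sketch with invented last-mile data, computing the Pandas equivalent of `RANK() OVER (PARTITION BY city ORDER BY deliveries DESC)`:

```python
import pandas as pd

# Hypothetical daily delivery counts per hub; rank each hub within its city,
# the kind of metric a dashboard tile might surface.
df = pd.DataFrame({
    "city": ["Noida", "Noida", "Delhi", "Delhi"],
    "hub":  ["N1", "N2", "D1", "D2"],
    "deliveries": [120, 95, 200, 180],
})
# groupby().rank() mirrors the SQL window function PARTITION BY city.
df["rank_in_city"] = (df.groupby("city")["deliveries"]
                        .rank(method="dense", ascending=False).astype(int))
print(df.sort_values(["city", "rank_in_city"]))
```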

Posted 2 days ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Zenoti provides an all-in-one, cloud-based software solution for the beauty and wellness industry. Our solution allows users to seamlessly manage every aspect of the business in a comprehensive mobile solution: online appointment bookings, POS, CRM, employee management, inventory management, built-in marketing programs, and more. Zenoti helps clients streamline their systems and reduce costs, while simultaneously improving customer retention and spending. Our platform is engineered for reliability and scale, and harnesses the power of enterprise-level technology for businesses of all sizes. Zenoti powers more than 30,000 salons, spas, medspas, and fitness studios in over 50 countries. This includes a vast portfolio of global brands, such as European Wax Center, Hand & Stone, Massage Heights, Rush Hair & Beauty, Sono Bello, Profile by Sanford, Hair Cuttery, CorePower Yoga, and TONI&GUY. Our recent accomplishments include surpassing a $1 billion unicorn valuation, being named Next Tech Titan by GeekWire, raising an $80 million investment from TPG, and ranking as the 316th fastest-growing company in North America on Deloitte’s 2020 Technology Fast 500™. We are also proud to be recognized as Great Place to Work Certified™ for 2021-2022, as this reaffirms our commitment to empowering people to feel good and find their greatness. To learn more about Zenoti visit: https://www.zenoti.com

What will I be doing?
- Design, architect, develop, and maintain components of Zenoti
- Collaborate with a team of product managers, developers, and quality assurance engineers to define, design, and deploy new features and functionality
- Build software that ensures the best possible usability, performance, quality, and responsiveness of features
- Work in a team following agile development practices (SCRUM)
- Learn to scale your features to handle 2x to 4x growth every year and manage code that deals with millions of records and terabytes of data
- Release new features into production every month and get real feedback from thousands of customers to refine your designs
- Be proud of what you work on, and obsess about the quality of your work; join our team to do the best work of your career

What skills do I need?
- 6+ years' experience developing ETL solutions and data pipelines, with expertise in processing trillions of records efficiently
- 6+ years' experience with SQL Server, T-SQL, and stored procedures, and a deep understanding of SQL performance tuning for large-scale data processing
- Strong understanding of ETL concepts, data modeling, and data warehousing principles, with hands-on experience building data pipelines using Python
- Extensive experience with big data platforms, including Azure Fabric, Azure Databricks, Azure Data Factory (ADF), Amazon Redshift, Apache Spark, and Delta Lake
- Expert-level SQL skills for complex data transformations, aggregations, and query optimization to handle trillions of records with optimal performance
- Hands-on experience creating data lakehouse architectures and implementing data governance and security best practices across big data platforms
- Strong logical, analytical, and problem-solving skills, with the ability to design and optimize distributed computing clusters for maximum throughput
- Excellent communication skills for cross-functional collaboration, and the ability to work in a fast-paced environment with changing priorities
- Experience with cloud-native data solutions, including Azure Data Lake, Azure Synapse, and containerization technologies (Docker, Kubernetes)
- Proven track record of implementing CI/CD pipelines for data engineering workflows, automating data pipeline deployment, and monitoring performance at scale

Benefits
- Attractive compensation
- Comprehensive medical coverage for yourself and your immediate family
- An environment where wellbeing is a high priority: access to regular yoga, meditation, breathwork, nutrition counseling, and stress management, with family included in most benefit-awareness sessions
- Opportunities to be part of a community and give back: social activities are part of our culture, and you can look forward to regular engagement, social work, and community give-back initiatives

Zenoti provides equal employment opportunities to all employees and applicants for employment, and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.
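Processing tables too large to load at once, as this listing describes, usually relies on partial aggregation: aggregate each chunk or partition, then merge the partials. This is the same associativity Spark and ADF exploit per partition; the sketch below uses pure Python and synthetic data to show just the pattern.

```python
# Partial-aggregation pattern for data too large to fit in memory:
# aggregate within each chunk, then merge the partial results.
def iter_chunks():
    # Stand-in for reading a huge table in batches (e.g. a DB cursor
    # or a Spark partition); rows are (country, bookings) pairs.
    yield [("IN", 10), ("US", 5)]
    yield [("IN", 7), ("UK", 3)]

totals = {}
for chunk in iter_chunks():
    partial = {}
    for country, bookings in chunk:            # aggregate within the chunk
        partial[country] = partial.get(country, 0) + bookings
    for country, subtotal in partial.items():  # merge partials (sum is associative)
        totals[country] = totals.get(country, 0) + subtotal

print(totals)  # {'IN': 17, 'US': 5, 'UK': 3}
```

The merge step only works because the aggregate is associative; a non-decomposable statistic (e.g. an exact median) would need a different strategy.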

Posted 2 days ago

Apply

0.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Date Posted: 2025-07-30
Country: India
Location: North Gate Business Park, Sy. No 2/1 and Sy. No 2/2, KIAL Road, Venkatala Village, Chowdeshwari Layout, Yelahanka, Bangalore, Karnataka 560064
Position Role Type: Unspecified

Who we are: At Pratt & Whitney, we believe that powered flight has transformed, and will continue to transform, the world. That’s why we work with an explorer’s heart and a perfectionist’s grit to design, build, and service the world’s most advanced aircraft engines. We do this across a diverse portfolio, including Commercial Engines, Military Engines, Business Aviation, General Aviation, Regional Aviation, and Helicopter Aviation, and as a way of turning possibilities into realities for our customers. This is how we at Pratt & Whitney approach our work, and therefore we are inspired to go beyond.

What are our expectations: P&WC engines are equipped with on-board data recorder solutions (FAST™/DCTU) that wirelessly transmit engine full-flight data to the P&WC Ground Station File Processing Center for data processing and reporting, supporting diagnostic and health management analysis to determine applicable engine maintenance tasks. The DPHM Ground team develops applications that process engine full-flight data and generate multiple reports and datasets distributed to various consumers. The DPHM Ground team is seeking a talented and system-aware Data Engineer to join our expanding data platform team. The ideal candidate will be responsible for supporting, developing, and optimizing our modern, event-driven data pipeline using Kafka Streams, stateful processing with RocksDB, Microsoft SQL, and Power BI-integrated dashboards. This role involves transforming legacy database components into a scalable data warehouse; enhancing observability, reliability, and performance; and converting raw data into actionable insights for clients and stakeholders. Join our team and be part of evolving our real-time hybrid data pipeline infrastructure, ensuring high performance, resilience, and a scalable, cloud-compatible environment for the P&WC Ground Station.

Qualifications You Must Have:
- Bachelor’s degree in Computer Science, Software Engineering, Data Engineering, or a related field
- Minimum of 5 years of experience in data engineering or a similar role
- Strong SQL development skills, including indexing, complex joins, window functions, stored procedures, and query optimization
- Experience with data visualization tools such as Power BI or OpenSearch/Kibana
- Proficiency in Microsoft Excel for data manipulation and reporting
- Familiarity with Java, Python, and C# for API development and maintenance
- Exposure to stream processing, schema registries, API contract versioning and evolution, and stateful operations
- Analytical and performance mindset: ability to interpret large datasets, draw meaningful conclusions, and present insights effectively while considering latency, throughput, and operational cost
- Communication skills: strong written and verbal communication to convey information between back-end developers, data analysts, and system engineers, while maintaining accountability and ownership of data design and change outcomes

Qualifications We Prefer:
- Data pipeline engineering and event streaming: architect and maintain real-time streaming pipelines using Kafka Streams, implementing key-based aggregations, windowing, and stateful operations backed by RocksDB; design event schemas and API contracts that serve both internal components and downstream consumers with minimal coupling
- Data ingestion and persistence: upgrade and maintain ingestion logic to persist processed outputs into structured databases, primarily Microsoft SQL Server for on-prem deployments and cloud-native databases in AWS; enhance and maintain database APIs for both batch and real-time data consumers throughout the Ground and Analytics pipeline
- Database optimization and complex query engineering: optimize SQL queries and stored procedures for high-volume transactional loads; collaborate with data analysts and business units to model data tables and relations that support Power BI needs; fine-tune indexing strategies, partitioning, and caching logic
- System monitoring, observability, and quality: instrument data pipeline components and APIs with structured logs for ingestion into OpenSearch and visualization in Kibana; conduct continuous quality checks during data transformation and ingestion to ensure data traceability and capture anomalies
- Cross-functional collaboration: work closely with the team to test and validate data pipeline artifacts; support internal and external developers in producing and consuming data pipelines and APIs through documentation and well-defined contracts/schemas

What We Offer
- Long-term deferred compensation programs
- Daycare for young children
- Advancement programs to enhance education skills
- Flexible work schedules
- Leadership and training programs
- Comprehensive benefits, savings, and pension plans
- Financial support for parental leave
- Reward programs for outstanding work

Work Location: Bangalore
Employment Type: Full-time

RTX adheres to the principles of equal employment. All qualified applications will be given careful consideration without regard to ethnicity, color, religion, gender, sexual orientation or identity, national origin, age, disability, protected veteran status, or any other characteristic protected by law.

Privacy Policy and Terms: Click on this link to read the Policy and Terms
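The key-based, windowed, stateful aggregation this role describes can be sketched in a few lines. This is a toy emulation of the Kafka Streams pattern, not Kafka itself: a plain dict stands in for the RocksDB state store, and the engine telemetry events are entirely hypothetical.

```python
# Toy sketch of a Kafka-Streams-style stateful operation: key-based
# aggregation over tumbling windows. A dict stands in for RocksDB;
# events are hypothetical (engine_id, timestamp_seconds, reading) tuples.
WINDOW = 60  # 1-minute tumbling windows

state = {}   # (engine_id, window_start) -> running sum
events = [("eng-1", 5, 10.0), ("eng-1", 20, 4.0),
          ("eng-1", 65, 7.0), ("eng-2", 10, 1.5)]

for key, ts, value in events:
    window_start = (ts // WINDOW) * WINDOW      # assign the event to its window
    slot = (key, window_start)
    state[slot] = state.get(slot, 0.0) + value  # stateful update per key + window

print(state)  # {('eng-1', 0): 14.0, ('eng-1', 60): 7.0, ('eng-2', 0): 1.5}
```

In real Kafka Streams the same shape is expressed with `groupByKey().windowedBy(...).aggregate(...)`, with the state store persisted and fault-tolerant rather than an in-memory dict.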

Posted 3 days ago

Apply

7.0 - 9.0 years

10 - 20 Lacs

Noida

On-site

We are looking for a Senior Data Analyst with a strong background in reporting, data validation, and business analysis. The ideal candidate will be responsible for ensuring data accuracy, designing insightful dashboards, automating recurring reporting tasks, and supporting stakeholders with data-driven recommendations that drive business growth. Key Responsibilities Build, automate, and maintain interactive reports and dashboards using Power BI, Tableau, or Excel. Conduct data validation, reconciliation, and quality checks across multiple data sources (e.g., SQL databases, APIs, spreadsheets). Analyze large datasets to identify trends, anomalies, and actionable insights for business teams. Collaborate with cross-functional teams to understand data needs and deliver customized reporting solutions. Use SQL and Python to automate recurring analysis and reporting workflows. Present key findings and performance reports to business stakeholders in a clear, concise manner. Maintain consistency with data governance, accuracy standards , and reporting best practices. Technical Skills Required Strong experience with SQL (complex queries, joins, aggregations, CTEs, etc.). Proficiency in Power BI, Tableau , or Google Data Studio . Hands-on experience with Python for data manipulation and reporting automation. Advanced knowledge of Microsoft Excel (PivotTables, formulas, data modeling). Understanding of data structures , KPI tracking , and business reporting metrics . Familiarity with data warehousing concepts is a plus. Qualifications Bachelor’s or Master’s degree in Computer Science, Data Analytics, Statistics , or a related field. 7–9 years of experience in data analysis, reporting, or business intelligence roles in an IT or technology-driven environment. Strong communication skills with the ability to explain complex data to non-technical stakeholders. 
Job Type: Full-time Pay: ₹1,000,000.00 - ₹2,000,000.00 per year Benefits: Health insurance Application Question(s): Notice Period Experience: Total Work: 7 years (Preferred) SQL: 7 years (Preferred) Python: 5 years (Preferred) Work Location: In person
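The SQL depth this posting asks for (joins, aggregations, CTEs) can be illustrated with a small reporting query. A sketch using Python's built-in sqlite3; the `sales` table and its values are invented for illustration, not from the posting:

```python
import sqlite3

# Invented sales table, purely to demonstrate the CTE + aggregation pattern.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, month TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('North', '2024-01', 100.0),
        ('North', '2024-02', 150.0),
        ('South', '2024-01', 80.0),
        ('South', '2024-02', 120.0);
""")

# A CTE computes per-region totals; the outer query then filters on the
# aggregate, which a plain WHERE clause on the base table could not do.
query = """
    WITH region_totals AS (
        SELECT region, SUM(amount) AS total
        FROM sales
        GROUP BY region
    )
    SELECT region, total
    FROM region_totals
    WHERE total > 200
    ORDER BY region;
"""
rows = conn.execute(query).fetchall()  # [('North', 250.0)]
```

In a real reporting workflow the same query would run against the production warehouse and feed a Power BI or Tableau dataset; the CTE keeps the aggregation step named and reusable.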

Posted 3 days ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Location: Hyderabad | Work Model: Hybrid (3 days from office) | Experience Required: 7+ years | Role Type: Individual Contributor, 6-month contract position. Role Summary We are seeking a QA Automation Engineer with 7+ years of hands-on experience in automation testing using Selenium WebDriver (Java), backend validation via SQL, and API testing through Postman. The ideal candidate will contribute to enterprise-scale automation suites, validate data across services, and collaborate in Agile delivery teams. This role emphasizes deep technical ability in writing, debugging, and managing test scripts, with a structured approach to test design and defect triaging. Candidates must demonstrate working knowledge of test frameworks like JUnit/TestNG, dependency tools like Maven, and collaboration platforms such as Git and JIRA.
Must-Have Skills (with Required Depth):
- Selenium WebDriver (Java): Must have independently designed and implemented automation test cases for complex, dynamic UIs. Candidate must demonstrate ability to build reusable page-object components, implement synchronization strategies using explicit waits, and handle DOM-level exceptions.
- SQL – Backend Validation: Should be proficient in writing mid-to-complex queries using joins, aggregations, and subqueries to validate multi-table relationships. Must be able to debug data mismatches directly against Oracle/MySQL/SQL Server.
- API Testing – Postman: Should have performed REST API validations using Postman. Must be able to test endpoints by setting headers/auth tokens, validate status codes, and assert payload structures (JSON/XML). Full automation of API suites is not required.
- JUnit / TestNG: Must have independently managed test execution using annotations (@BeforeClass, @DataProvider), defined test groups, configured retries, and asserted results across functional modules.
- Maven / Gradle: Should be capable of managing automation test suites via Maven, including configuring dependencies in pom.xml, executing test lifecycles (mvn test), and interpreting console output.
- BDD (Cucumber): Must have authored Gherkin-based feature files and collaborated with business analysts for scenario design. Step definition coding is not mandatory, but knowledge of how feature files plug into test execution is required.
- JIRA: Should be proficient in documenting test cases, logging bugs, linking defects to epics/stories, and updating Agile boards.
- Git / GitHub: Must be able to manage code via Git: branch creation, rebasing, conflict resolution, and using pull requests. Expected to demonstrate fluency in working with shared repositories.
- Agile/Scrum: Must have worked within structured sprints, participated in ceremonies (stand-ups, retros, grooming), and contributed toward QA sprint goals independently.
Nice-to-Have Skills:
- REST Assured (Java): Familiarity with automating API calls using REST Assured is a plus. Should know how to configure base URI, handle authentication tokens, and parse JSON response data. Not mandatory if Postman is well understood.
- Step Definitions (BDD): Prior experience writing Java-based step definitions using Cucumber-JUnit integration is desirable but not mandatory.
- CI/CD – Jenkins, GitHub Actions: Should be aware of triggering builds, configuring jobs to run automated tests, and interpreting build logs. Ownership of pipeline setup is not required.
- Test Reporting – ExtentReports, Allure: Exposure to integrating test reports into frameworks and customizing test logs into HTML/dashboard outputs is preferred.
- Cross-Browser Testing: Should understand browser compatibility strategies. Experience running tests via Selenium Grid or services like BrowserStack is a bonus.
- Database Connectivity – JDBC: Basic understanding of establishing JDBC connections to query data from within test automation scripts. Not a required component for this role.
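The backend-validation and data-mismatch debugging described in this posting often boils down to running the same query in two environments and diffing the result sets. A hedged sketch of that pattern; sqlite3 stands in for the Oracle/MySQL/SQL Server targets, and the `trades` table is invented for illustration:

```python
import sqlite3

def fetch_rows(conn, query):
    """Return a query's result set as a set of tuples, so comparison is
    order-insensitive (row order is rarely guaranteed across environments)."""
    return set(conn.execute(query).fetchall())

def diff_environments(conn_a, conn_b, query):
    """Rows present in one environment but not the other. This is the shape
    of the backend validation step a QA engineer performs; connection setup
    for a real database would differ."""
    rows_a = fetch_rows(conn_a, query)
    rows_b = fetch_rows(conn_b, query)
    return {"only_in_a": rows_a - rows_b, "only_in_b": rows_b - rows_a}

# Illustrative setup: a dev and a QA database that disagree on one row.
dev = sqlite3.connect(":memory:")
dev.execute("CREATE TABLE trades (id INTEGER, qty INTEGER)")
dev.executemany("INSERT INTO trades VALUES (?, ?)", [(1, 10), (2, 20)])

qa = sqlite3.connect(":memory:")
qa.execute("CREATE TABLE trades (id INTEGER, qty INTEGER)")
qa.executemany("INSERT INTO trades VALUES (?, ?)", [(1, 10), (2, 25)])

diff = diff_environments(dev, qa, "SELECT id, qty FROM trades")
```

An empty diff on both sides is the pass condition; any populated side pinpoints exactly which keys to investigate.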

Posted 4 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: SQL Tester Location: Pune Experience Level: 5+ years Job Type: Full-time Job Description: We are looking for a skilled SQL Tester to join our QA team. The role involves validating backend data processes, writing complex SQL queries to verify data integrity, and working closely with developers and analysts to ensure high data quality and accuracy in software applications. Key Responsibilities: Perform backend data validation using SQL queries Write and execute test cases for database testing Identify and report data-related defects Work with ETL and BI teams to test data pipelines and reports Participate in requirement analysis and test planning Required Skills: Strong SQL skills (joins, subqueries, aggregations, etc.) Experience in database testing and data validation Understanding of relational databases (e.g., MySQL, SQL Server, Oracle) Familiarity with ETL processes and tools (nice to have) Good analytical and problem-solving skills Experience with test management and defect tracking tools Employee Benefits: Group Medical Insurance Cab facility Meals/snacks Continuous Learning Program Company Profile: Stratacent is a Global IT Consulting and Services firm, headquartered in Jersey City, NJ, with global delivery centres in Pune and Gurugram plus offices in the USA, London, Canada and South Africa. We are a leading IT services provider focusing on Financial Services, Insurance, Healthcare and Life Sciences. We help our customers in their transformation journey and provide services around Information Security, Cloud Services, Data and AI, Automation, Application Development and IT Operations. URL - http://stratacent.com Stratacent India Private Limited is an equal opportunity employer and will not discriminate against any employee or applicant for employment on the basis of race, color, creed, religion, age, sex, national origin, ancestry, handicap, or any other factor protected by law.
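One classic data-integrity query a SQL tester writes is the orphaned-row check: a LEFT JOIN plus IS NULL flags child rows whose parent is missing. A sketch with Python's sqlite3; the customers/orders tables are hypothetical:

```python
import sqlite3

# Hypothetical parent/child tables seeded with one deliberate defect:
# order 11 references customer 3, which does not exist.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ravi');
    INSERT INTO orders VALUES (10, 1, 99.0), (11, 3, 45.0);
""")

# LEFT JOIN keeps every order; a NULL on the customer side marks an orphan.
orphans = conn.execute("""
    SELECT o.id
    FROM orders o
    LEFT JOIN customers c ON c.id = o.customer_id
    WHERE c.id IS NULL;
""").fetchall()  # [(11,)]
```

In a test case, an empty result set is the expected outcome; any returned IDs become a data-related defect report with the offending keys already attached.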

Posted 4 days ago

Apply

0 years

3 - 6 Lacs

Bengaluru

On-site

As an employee at Thomson Reuters, you will play a role in shaping and leading the global knowledge economy. Our technology drives global markets and helps professionals around the world make decisions that matter. As the world’s leading provider of intelligent information, we want your unique perspective to create the solutions that advance our business and your career. Our Service Management function is transforming into a truly global, data and standards-driven organization, employing best-in-class tools and practices across all disciplines of Technology Operations. This will drive ever-greater stability and consistency of service across the technology estate as we drive towards optimal Customer and Employee experience. About the role: In this opportunity as Application Support Analyst, you will: Provide Informatica support; the engineer will be responsible for supporting Informatica development, extractions, and loading, fixing data discrepancies, and taking care of performance monitoring. Collaborate with stakeholders such as business teams, product owners, and project management in defining roadmaps for applications and processes. Drive continual service improvement and innovation in productivity, software quality, and reliability, including meeting/exceeding SLAs. Demonstrate a thorough understanding of ITIL processes related to incident management, problem management, application life cycle management, and operational health management. Support applications built on modern application architecture and cloud infrastructure: Informatica PowerCenter/IDQ, JavaScript frameworks and libraries, HTML/CSS/JS, Node.JS, TypeScript, jQuery, Docker, AWS/Azure. About You: You're a fit for the role of Application Support Analyst - Informatica if your background includes: 3 to 8+ years of experience as an Informatica Developer and in support, responsible for implementation of ETL methodology in Data Extraction, Transformation and Loading.
Has knowledge of ETL design of new or changing mappings and workflows with the team and prepares technical specifications. Should have experience in creating ETL Mappings, Mapplets, Workflows, Worklets using Informatica PowerCenter 10.x and preparing corresponding documentation. Designs and builds integrations supporting standard data warehousing objects (type-2 dimensions, aggregations, star schema, etc.). Should be able to perform source system analysis as required. Works with DBAs and Data Architects to plan and implement an appropriate data partitioning strategy in the Enterprise Data Warehouse. Implements versioning of the ETL repository and supporting code as necessary. Develops stored procedures, database triggers and SQL queries where needed. Implements best practices and tunes SQL code for optimization. Loads data from SF Power Exchange to a relational database using Informatica. Works with XMLs, XML parser, Java and HTTP transformation within Informatica. Experience in integration of various data sources like Oracle, SQL Server, DB2 and flat files in various formats like fixed width, CSV, Salesforce and Excel. Has in-depth knowledge and experience in implementing best practices for design and development of data warehouses using star schema and snowflake schema design concepts. Experience in performance tuning of sources, targets, mappings, transformations, and sessions. Carried out support and development activities in a relational database environment, designed tables, procedures/functions, packages, triggers and views in relational databases and used SQL proficiently in database programming using SNFL. What’s in it For You? Hybrid Work Model: We’ve adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance. Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow’s challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future. Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing. Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together. Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives. Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world. 
About Us Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
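The requirements in the posting above include tuning SQL code for optimization. One portable way to verify a tuning change is to compare the query plan before and after adding an index; a sketch with Python's sqlite3 (the Oracle/Informatica specifics of the actual role would differ, and the `txns` table is invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txns (id INTEGER, account_id INTEGER, amount REAL)")

def plan(sql):
    """Concatenate the access-path descriptions from EXPLAIN QUERY PLAN;
    the detail text is the last column of each output row."""
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

sql = "SELECT amount FROM txns WHERE account_id = 42"

before = plan(sql)  # full table scan: no index can serve the predicate
conn.execute("CREATE INDEX idx_txns_account ON txns (account_id)")
after = plan(sql)   # index search on account_id
```

The same workflow (capture plan, apply index or rewrite, re-capture plan) applies to any engine that exposes an EXPLAIN facility; only the plan vocabulary changes.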

Posted 4 days ago

Apply

5.0 years

0 Lacs

Greater Bengaluru Area

On-site

What if the work you did every day could impact the lives of people you know? Or all of humanity? At Illumina, we are expanding access to genomic technology to realize health equity for billions of people around the world. Our efforts enable life-changing discoveries that are transforming human health through the early detection and diagnosis of diseases and new treatment options for patients. Working at Illumina means being part of something bigger than yourself. Every person, in every role, has the opportunity to make a difference. Surrounded by extraordinary people, inspiring leaders, and world changing projects, you will do more and become more than you ever thought possible. Position Summary We are seeking a highly skilled Senior Data Engineer Developer with 5+ years of experience to join our talented team in Bangalore. In this role, you will be responsible for designing, implementing, and optimizing data pipelines, ETL processes, and data integration solutions using Python, Spark, SQL, Snowflake, dbt, and other relevant technologies. Additionally, you will bring strong domain expertise in operations organizations, with a focus on supply chain and manufacturing functions. If you're a seasoned data engineer with a proven track record of delivering impactful data solutions in operations contexts, we want to hear from you. Responsibilities Lead the design, development, and optimization of data pipelines, ETL processes, and data integration solutions using Python, Spark, SQL, Snowflake, dbt, and other relevant technologies. Apply strong domain expertise in operations organizations, particularly in functions like supply chain and manufacturing, to understand data requirements and deliver tailored solutions. Utilize big data processing frameworks such as Apache Spark to process and analyze large volumes of operational data efficiently. Implement data transformations, aggregations, and business logic to support analytics, reporting, and operational decision-making. 
Leverage cloud-based data platforms such as Snowflake to store and manage structured and semi-structured operational data at scale. Utilize dbt (Data Build Tool) for data modeling, transformation, and documentation to ensure data consistency, quality, and integrity. Monitor and optimize data pipelines and ETL processes for performance, scalability, and reliability in operations contexts. Conduct data profiling, cleansing, and validation to ensure data quality and integrity across different operational data sets. Collaborate closely with cross-functional teams, including operations stakeholders, data scientists, and business analysts, to understand operational challenges and deliver actionable insights. Stay updated on emerging technologies and best practices in data engineering and operations management, contributing to continuous improvement and innovation within the organization. All listed requirements are deemed as essential functions to this position; however, business conditions may require reasonable accommodations for additional tasks and responsibilities. Preferred Experience/Education/Skills Bachelor's degree in Computer Science, Engineering, Operations Management, or related field. 5+ years of experience in data engineering, with proficiency in Python, Spark, SQL, Snowflake, dbt, and other relevant technologies. Strong domain expertise in operations organizations, particularly in functions like supply chain and manufacturing. Strong domain expertise in life sciences manufacturing equipment, with a deep understanding of industry-specific challenges, processes, and technologies. Experience with big data processing frameworks such as Apache Spark and cloud-based data platforms such as Snowflake. Hands-on experience with data modeling, ETL development, and data integration in operations contexts. Familiarity with dbt (Data Build Tool) for managing data transformation and modeling workflows.
Familiarity with reporting and visualization tools like Tableau, Power BI, etc. Good understanding of advanced data engineering and data science practices and technologies like PySpark, SageMaker, Cloudera MLflow, etc. Experience with SAP, SAP HANA and Teamcenter applications is a plus. Excellent problem-solving skills, analytical thinking, and attention to detail. Strong communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams and operations stakeholders. Eagerness to learn and adapt to new technologies and tools in a fast-paced environment. We are a company deeply rooted in belonging, promoting an inclusive environment where employees feel valued and empowered to contribute to our mission. Built on a strong foundation, Illumina has always prioritized openness, collaboration, and seeking alternative perspectives to propel innovation in genomics. We are proud to confirm a zero-net gap in pay, regardless of gender, ethnicity, or race. We also have several Employee Resource Groups (ERG) that deliver career development experiences, increase cultural awareness, and offer opportunities to engage in social responsibility. We are proud to be an equal opportunity employer committed to providing employment opportunity regardless of sex, race, creed, color, gender, religion, marital status, domestic partner status, age, national origin or ancestry, physical or mental disability, medical condition, sexual orientation, pregnancy, military or veteran status, citizenship status, and genetic information. Illumina conducts background checks on applicants for whom a conditional offer of employment has been made. Qualified applicants with arrest or conviction records will be considered for employment in accordance with applicable local, state, and federal laws. Background check results may potentially result in the withdrawal of a conditional offer of employment.
The background check process and any decisions made as a result shall be made in accordance with all applicable local, state, and federal laws. Illumina prohibits the use of generative artificial intelligence (AI) in the application and interview process. If you require accommodation to complete the application or interview process, please contact accommodations@illumina.com. To learn more, visit: https://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf. The position will be posted until a final candidate is selected or the requisition has a sufficient number of qualified applicants. This role is not eligible for visa sponsorship.
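The transformation-and-aggregation responsibility described in the data engineer posting above reduces to grouping records and applying business logic. A plain-Python sketch of that rollup; the shipment fields are invented for illustration, not an Illumina schema, and in practice the same step would be a Spark job or a dbt model:

```python
from collections import defaultdict

# Invented supply-chain records, standing in for rows landed by an ETL pipeline.
shipments = [
    {"supplier": "acme",   "site": "bangalore", "units": 120},
    {"supplier": "acme",   "site": "singapore", "units": 80},
    {"supplier": "globex", "site": "bangalore", "units": 50},
]

def units_by_supplier(rows):
    """Roll shipment rows up to per-supplier unit totals: the shape of the
    aggregation/business-logic step a pipeline implements before the data
    reaches analytics or reporting."""
    totals = defaultdict(int)
    for row in rows:
        totals[row["supplier"]] += row["units"]
    return dict(totals)
```

The equivalent in SQL or Spark is a GROUP BY with SUM; expressing it once in plain code makes it easy to unit-test the business rule before porting it into the warehouse layer.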

Posted 6 days ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Position: Axiom Manager Responsibilities: Work as Manager on multiple clients as a part of the regulatory reporting implementation team. Facilitate and encourage the necessary conversations between the stakeholders (client and/or onshore team) to determine requirements. Work independently with minimum supervision from the Onshore project team/client. Provide technical guidance to the team as well as the client as needed. Be actively involved in project management. Train, motivate, mentor and coach Seniors and staff to meet the project objectives. Proactively develop and impart training on newly onboarded or upcoming initiatives for team members. Work on identifying process improvement areas and bring in the culture of automation. Requirements: 8+ years of overall experience in the Finance industry with a minimum of 6 years of development experience in Axiom Controller View. Good understanding of Axiom objects/functionalities - Data Sources, Data Models, Shorthands, Portfolios, Aggregations, Freeform or Taxonomy, Tabular Report, Workflow, User Defined Functions, Sign-off, Freezing etc. Proficiency in development of Freeform or Taxonomy regulatory reports using Axiom Controller View. Good understanding of Regulatory Reporting financial products. Experience with any major relational database (Oracle, MySQL, Sybase). Familiar with Axiom v10 architecture. Familiarity with Unix, shell scripting. Should be an expert in Advanced SQL. Should have experience in leading or managing a team of Axiom professionals. Actively participate in the selection of new regulatory tools/frameworks and methodologies.
Recommend and assist in their implementation. Functional understanding of US regulatory reports: Fed reports (FR Y-9C/FR Y-14Q, FFIEC reports, FR 2052a liquidity reporting). Nice to have: Experience in building Taxonomy Reports using Axiom from scratch. Experience in migration from Axiom V9 to V10. Good understanding of other regulatory tools, namely Wolters Kluwer OneSumX and Vermeg. Intermediate experience in Python programming. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 6 days ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Position: Axiom Manager Responsibilities: Work as Manager on multiple clients as a part of the regulatory reporting implementation team. Facilitate and encourage the necessary conversations between the stakeholders (client and/or onshore team) to determine requirements. Work independently with minimum supervision from the Onshore project team/client. Provide technical guidance to the team as well as the client as needed. Be actively involved in project management. Train, motivate, mentor and coach Seniors and staff to meet the project objectives. Proactively develop and impart training on newly onboarded or upcoming initiatives for team members. Work on identifying process improvement areas and bring in the culture of automation. Requirements: 8+ years of overall experience in the Finance industry with a minimum of 6 years of development experience in Axiom Controller View. Good understanding of Axiom objects/functionalities - Data Sources, Data Models, Shorthands, Portfolios, Aggregations, Freeform or Taxonomy, Tabular Report, Workflow, User Defined Functions, Sign-off, Freezing etc. Proficiency in development of Freeform or Taxonomy regulatory reports using Axiom Controller View. Good understanding of Regulatory Reporting financial products. Experience with any major relational database (Oracle, MySQL, Sybase). Familiar with Axiom v10 architecture. Familiarity with Unix, shell scripting. Should be an expert in Advanced SQL. Should have experience in leading or managing a team of Axiom professionals. Actively participate in the selection of new regulatory tools/frameworks and methodologies.
Recommend and assist in their implementation. Functional understanding of US regulatory reports: Fed reports (FR Y-9C/FR Y-14Q, FFIEC reports, FR 2052a liquidity reporting). Nice to have: Experience in building Taxonomy Reports using Axiom from scratch. Experience in migration from Axiom V9 to V10. Good understanding of other regulatory tools, namely Wolters Kluwer OneSumX and Vermeg. Intermediate experience in Python programming. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 6 days ago

Apply

4.0 years

3 - 6 Lacs

Hyderābād

On-site

CDP ETL & Database Engineer The CDP ETL & Database Engineer will specialize in architecting, designing, and implementing solutions that are sustainable and scalable. The ideal candidate will understand CRM methodologies, with an analytical mindset, and a background in relational modeling in a Hybrid architecture. The candidate will help drive the business towards specific technical initiatives and will work closely with the Solutions Management, Delivery, and Product Engineering teams. The candidate will join a team of developers across the US, India & Costa Rica. Responsibilities: ETL Development – The CDP ETL & Database Engineer will be responsible for building pipelines to feed downstream data processes. They will be able to analyze data, interpret business requirements, and establish relationships between data sets. The ideal candidate will be familiar with different encoding formats and file layouts such as JSON and XML. Implementations & Onboarding – Will work with the team to onboard new clients onto the ZMP/CDP+ platform. The candidate will solidify business requirements, perform ETL file validation, establish users, perform complex aggregations, and syndicate data across platforms. The hands-on engineer will take a test-driven approach towards development and will be able to document processes and workflows. Incremental Change Requests – The CDP ETL & Database Engineer will be responsible for analyzing change requests and determining the best approach towards their implementation and execution. This requires the engineer to have a deep understanding of the platform's overall architecture. Change requests will be implemented and tested in a development environment to ensure their introduction will not negatively impact downstream processes.
Change Data Management – The candidate will adhere to change data management procedures and actively participate in CAB meetings where change requests will be presented and reviewed. Prior to introducing change, the engineer will ensure that processes are running in a development environment. The engineer will be asked to do peer-to-peer code reviews and solution reviews before production code deployment. Collaboration & Process Improvement – The engineer will be asked to participate in knowledge share sessions where they will engage with peers to discuss solutions, best practices, and overall approach. The candidate will be able to look for opportunities to streamline processes with an eye towards building a repeatable model to reduce implementation duration. Job Requirements: The CDP ETL & Database Engineer will be well versed in the following areas: Relational data modeling. ETL and FTP concepts. Advanced analytics using SQL functions. Cloud technologies - AWS, Snowflake. Able to decipher requirements, provide recommendations, and implement solutions within predefined guidelines. The ability to work independently, but at the same time, the individual will be called upon to contribute in a team setting. The engineer will be able to confidently communicate status, raise exceptions, and voice concerns to their direct manager. Participate in internal client project status meetings with the Solution/Delivery management team. When required, collaborate with the Business Solutions Analyst (BSA) to solidify requirements. Ability to work in a fast-paced, agile environment; the individual will be able to work with a sense of urgency when escalated issues arise. Strong communication and interpersonal skills, with the ability to multitask and prioritize workload based on client demand. Familiarity with Jira for workflow management and time allocation. Familiarity with the Scrum framework: backlog, planning, sprints, story points, retrospectives.
Required Skills: ETL – ETL tools such as Talend (preferred, not required); DMExpress – nice to have; Informatica – nice to have. Database – Hands-on experience with the following database technologies: Snowflake (required); MySQL/PostgreSQL – nice to have; familiarity with NoSQL DB methodologies (nice to have). Programming Languages – Can demonstrate knowledge of any of the following: PL/SQL; JavaScript – strong plus; Python – strong plus; Scala – nice to have. AWS – Knowledge of the following AWS services: S3, EMR (concepts), EC2 (concepts), Systems Manager / Parameter Store. Understands JSON data structures and key-value pairs. Working knowledge of code repositories such as Git, WinCVS; workflow management tools such as Apache Airflow, Kafka, Automic/Appworx; Jira. Minimum Qualifications: Bachelor's degree or equivalent. 4+ years' experience. Excellent verbal and written communication skills. Self-starter, highly motivated. Analytical mindset. Company Summary: Zeta Global is a NYSE-listed data-powered marketing technology company with a heritage of innovation and industry leadership. Founded in 2007 by entrepreneur David A. Steinberg and John Sculley, former CEO of Apple Inc and Pepsi-Cola, the Company combines the industry's 3rd largest proprietary data set (2.4B+ identities) with Artificial Intelligence to unlock consumer intent, personalize experiences and help our clients drive business growth. Our technology runs on the Zeta Marketing Platform, which powers 'end to end' marketing programs for some of the world's leading brands. With expertise encompassing all digital marketing channels – Email, Display, Social, Search and Mobile – Zeta orchestrates acquisition and engagement programs that deliver results that are scalable, repeatable and sustainable.
Zeta Global is an Equal Opportunity/Affirmative Action employer and does not discriminate on the basis of race, gender, ancestry, color, religion, sex, age, marital status, sexual orientation, gender identity, national origin, medical condition, disability, veteran status, or any other basis protected by law. Zeta Global Recognized in Enterprise Marketing Software and Cross-Channel Campaign Management Reports by Independent Research Firm https://www.forbes.com/sites/shelleykohan/2024/06/1G/amazon-partners-with-zeta-global-to-deliver-gen-ai-marketing-automation/ https://www.cnbc.com/video/2024/05/06/zeta-global-ceo-david-steinberg-talks-ai-in-focus-at-milken-conference.html https://www.businesswire.com/news/home/20240G04622808/en/Zeta-Increases-3Q%E2%80%GG24-Guidance https://www.prnewswire.com/news-releases/zeta-global-opens-ai-data-labs-in-san-francisco-and-nyc-300S45353.html https://www.prnewswire.com/news-releases/zeta-global-recognized-in-enterprise-marketing-software-and-cross-channel-campaign-management-reports-by-independent-research-firm-300S38241.html

Posted 6 days ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

The CDP ETL & Database Engineer will specialize in architecting, designing, and implementing solutions that are sustainable and scalable. The ideal candidate will understand CRM methodologies, bring an analytical mindset, and have a background in relational modeling in a hybrid architecture. The candidate will help drive the business towards specific technical initiatives and will work closely with the Solutions Management, Delivery, and Product Engineering teams, joining a team of developers across the US, India & Costa Rica.

Responsibilities

ETL Development – The CDP ETL & Database Engineer will be responsible for building pipelines to feed downstream data processes. They will be able to analyze data, interpret business requirements, and establish relationships between data sets. The ideal candidate will be familiar with different encoding formats and file layouts such as JSON and XML.

Implementations & Onboarding – Will work with the team to onboard new clients onto the ZMP/CDP+. The candidate will solidify business requirements, perform ETL file validation, establish users, perform complex aggregations, and syndicate data across platforms. The hands-on engineer will take a test-driven approach towards development and will be able to document processes and workflows.

Incremental Change Requests – The CDP ETL & Database Engineer will be responsible for analyzing change requests and determining the best approach towards their implementation and execution. This requires the engineer to have a deep understanding of the platform's overall architecture. Change requests will be implemented and tested in a development environment to ensure their introduction will not negatively impact downstream processes.
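The JSON file handling described above can be sketched in a few lines. This is a hedged, minimal illustration only: the feed layout and field names (customer_id, email, attributes) are invented for the example and do not reflect any real ZMP/CDP+ schema.

```python
import json

# Hypothetical newline-delimited JSON feed; field names are illustrative only.
RAW_FEED = """
{"customer_id": 101, "email": "a@example.com", "attributes": {"tier": "gold", "opt_in": true}}
{"customer_id": 102, "email": null, "attributes": {"tier": "silver", "opt_in": false}}
"""

def parse_feed(raw: str) -> list[dict]:
    """Parse newline-delimited JSON and flatten the nested attributes object
    into one row per record, ready to load into a downstream table."""
    rows = []
    for line in raw.strip().splitlines():
        rec = json.loads(line)
        rows.append({
            "customer_id": rec["customer_id"],
            "email": rec["email"],
            "tier": rec["attributes"]["tier"],
            "opt_in": rec["attributes"]["opt_in"],
        })
    return rows

rows = parse_feed(RAW_FEED)
print(len(rows), rows[0]["tier"])  # 2 gold
```

In practice the same flattening step would also be the natural place for ETL file validation, e.g. rejecting records with missing required keys before they reach downstream processes.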
Change Data Management – The candidate will adhere to change data management procedures and actively participate in CAB meetings where change requests will be presented and reviewed. Prior to introducing a change, the engineer will ensure that processes are running in a development environment. The engineer will be asked to perform peer-to-peer code reviews and solution reviews before production code deployment.

Collaboration & Process Improvement – The engineer will be asked to participate in knowledge-share sessions, engaging with peers to discuss solutions, best practices, and overall approach. The candidate will look for opportunities to streamline processes, with an eye towards building a repeatable model that reduces implementation duration.

Job Requirements: The CDP ETL & Database Engineer will be well versed in the following areas: relational data modeling; ETL and FTP concepts; advanced analytics using SQL functions; cloud technologies (AWS, Snowflake); and the ability to decipher requirements, provide recommendations, and implement solutions. The engineer should be able to work independently while also contributing in a team setting, and should confidently communicate status, raise exceptions, and voice concerns to their direct manager. The role includes participating in internal client project status meetings with Solution/Delivery management and, when required, collaborating with the Business Solutions Analyst (BSA) to solidify business requirements. Ability to work in a fast-paced, agile environment; the individual will work with a sense of urgency when escalated issues arise. Strong communication and interpersonal skills, with the ability to multitask and prioritize workload based on client demand. Familiarity with Jira for workflow management and time allocation. Familiarity with the Scrum framework: backlog, planning, sprints, story points, retrospectives.
Required Skills: ETL – experience with ETL tools such as Talend (preferred, not required); DMExpress (nice to have); Informatica (nice to have). Database – hands-on experience with the following database technologies: Snowflake (required); MySQL/PostgreSQL (nice to have); familiarity with NoSQL DB methodologies (nice to have). Programming languages – can demonstrate knowledge of any of the following: PL/SQL; JavaScript (strong plus); Python (strong plus); Scala (nice to have). AWS – knowledge of the following AWS services: S3, EMR (concepts), EC2 (concepts), Systems Manager / Parameter Store. Understands JSON data structures and key-value pairs. Working knowledge of code repositories such as Git and WinCVS; workflow management tools such as Apache Airflow, Kafka, Automic/Appworx; and Jira. Minimum Qualifications: Bachelor's degree or equivalent; 4+ years' experience; excellent verbal and written communication skills; self-starter, highly motivated; analytical mindset. Company Summary: Zeta Global is a NYSE-listed, data-powered marketing technology company with a heritage of innovation and industry leadership. Founded in 2007 by entrepreneur David A. Steinberg and John Sculley, former CEO of Apple Inc. and Pepsi-Cola, the Company combines the industry’s 3rd-largest proprietary data set (2.4B+ identities) with Artificial Intelligence to unlock consumer intent, personalize experiences and help our clients drive business growth. Our technology runs on the Zeta Marketing Platform, which powers ‘end to end’ marketing programs for some of the world’s leading brands. With expertise encompassing all digital marketing channels – Email, Display, Social, Search and Mobile – Zeta orchestrates acquisition and engagement programs that deliver results that are scalable, repeatable and sustainable.
Zeta Global is an Equal Opportunity/Affirmative Action employer and does not discriminate on the basis of race, gender, ancestry, color, religion, sex, age, marital status, sexual orientation, gender identity, national origin, medical condition, disability, veteran status, or any other basis protected by law. Zeta Global Recognized in Enterprise Marketing Software and Cross-Channel Campaign Management Reports by Independent Research Firm https://www.forbes.com/sites/shelleykohan/2024/06/1G/amazon-partners-with-zeta-global-to-deliver-gen-ai-marketing-automation/ https://www.cnbc.com/video/2024/05/06/zeta-global-ceo-david-steinberg-talks-ai-in-focus-at-milken-conference.html https://www.businesswire.com/news/home/20240G04622808/en/Zeta-Increases-3Q%E2%80%GG24-Guidance https://www.prnewswire.com/news-releases/zeta-global-opens-ai--data-labs-in-san-francisco-and-nyc-300S45353.html https://www.prnewswire.com/news-releases/zeta-global-recognized-in-enterprise-marketing-software-and-cross-channel-campaign-management-reports-by-independent-research-firm-300S38241.html

Posted 1 week ago

Apply

4.0 years

18 - 20 Lacs

Bengaluru, Karnataka, India

On-site

This role is for one of Weekday's clients. Salary range: Rs 1800000 - Rs 2000000 (i.e., INR 18-20 LPA). Min experience: 4 years. Location: Bangalore. Job type: full-time.

Requirements: We are seeking an experienced and detail-oriented Data Analyst with a strong background in SQL, PySpark, Python, and Power BI (PBI) to join our data and analytics team. As a Data Analyst, you will play a critical role in transforming raw data into actionable insights that drive strategic business decisions. You'll work closely with cross-functional teams including business, product, engineering, and marketing to understand data requirements, build robust data models, and deliver meaningful reports and dashboards. The ideal candidate has 4+ years of hands-on experience working in fast-paced, data-driven environments, with a strong command of data querying, scripting, and visualization. This is an excellent opportunity for someone who enjoys solving complex data problems and communicating insights to both technical and non-technical stakeholders.

Key Responsibilities: Data Extraction & Transformation: Use SQL and PySpark to extract, clean, transform, and aggregate large datasets from structured and unstructured sources. Data Analysis: Conduct exploratory and ad-hoc data analysis using Python and other statistical tools to identify trends, anomalies, and business opportunities. Dashboarding & Reporting: Design, develop, and maintain interactive dashboards and reports using Power BI to visualize KPIs, business metrics, and forecasts. Data Modeling: Build and maintain efficient and scalable data models to support reporting and analytics use cases. Business Collaboration: Partner with internal teams to gather requirements, understand business challenges, and deliver data-driven solutions. Performance Tracking: Monitor campaign and business performance, identify areas of improvement, and suggest data-backed strategies.
Automation: Streamline and automate recurring reporting processes using Python scripting and Power BI integrations. Data Governance: Ensure data accuracy, consistency, and compliance with privacy regulations and data governance frameworks. Documentation: Maintain comprehensive documentation of data workflows, pipelines, and dashboards for knowledge transfer and reproducibility.

Required Skills and Qualifications: Bachelor's or Master's degree in Computer Science, Data Science, Mathematics, Statistics, or a related field. 4+ years of professional experience as a Data Analyst or in a similar role involving large-scale data analysis. Strong expertise in SQL for data querying, joins, aggregations, and optimization techniques. Hands-on experience with PySpark for big data processing and distributed computing. Proficiency in Python for data manipulation, statistical analysis, and building automation scripts. Advanced working knowledge of Power BI for building reports, dashboards, and performing DAX calculations. Strong analytical thinking, with the ability to work independently and manage multiple projects simultaneously. Excellent communication and stakeholder management skills; ability to translate complex data into simple business insights. Familiarity with cloud platforms (Azure/AWS/GCP), data warehouses (Snowflake, Redshift, BigQuery), and version control tools (Git) is a plus.
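The join-and-aggregate SQL skills this role calls for can be sketched with a tiny example. This is an illustration only, using an in-memory SQLite database as a stand-in for the actual warehouse; the tables, columns, and data are invented.

```python
import sqlite3

# In-memory SQLite stand-in for a warehouse; schema and rows are hypothetical.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL);
CREATE TABLE customers (customer_id INTEGER, region TEXT);
INSERT INTO orders VALUES (1, 10, 120.0), (2, 10, 80.0), (3, 11, 50.0);
INSERT INTO customers VALUES (10, 'South'), (11, 'North');
""")

# A representative join + aggregation: order count and revenue per region.
query = """
SELECT c.region, COUNT(*) AS n_orders, SUM(o.amount) AS revenue
FROM orders o
JOIN customers c ON c.customer_id = o.customer_id
GROUP BY c.region
ORDER BY revenue DESC;
"""
results = con.execute(query).fetchall()
print(results)  # [('South', 2, 200.0), ('North', 1, 50.0)]
```

The same query shape transfers directly to PySpark (`spark.sql(...)`) or a warehouse engine; only the connection layer changes.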

Posted 1 week ago

Apply

7.0 years

0 Lacs

India

On-site

Responsibilities: Analyze business and functional requirements related to hardship cases, loan restructures, and customer communication logic in Experian PowerCurve. Design, write, and execute test cases, test scenarios, and data validation scripts for PowerCurve workflows. Validate automation logic around: communication suppression (email/SMS/letter); loan restructuring and hardship arrangements; workflow routing based on customer status (arrears, restructure, probation, delinquency). Perform detailed regression, UAT, and system integration testing. Perform database testing to validate data flow to downstream PCC tables. Strong proficiency in SQL (joins, subqueries, aggregations, DML & DDL commands) and the ability to write ad-hoc SQL queries for validating test results. Experience with SQL Server Management Studio (SSMS) or equivalent tools. Raise, track, and validate defects using tools such as JIRA, qTest, or ALM. Collaborate closely with Business Analysts, Developers, and Business Stakeholders to ensure full coverage of test cases. Prepare daily/weekly status reports and test summaries, and contribute to go/no-go decisions. Participate in Agile ceremonies including backlog grooming, sprint planning, and retrospectives.
Test Initiation – Support the QA manager in the test initiation phase with requirement analysis and test effort estimation.

Test Plan – Review and understand the project-specific business requirements. Document questions and get answers using clarification trackers/walkthroughs. Identify the testable requirements, impacted applications, and processes. Identify the test environment and test data requirements. Map the requirements in the traceability matrix. Raise the test environment and test data requests. Prepare the test cases and peer-review them. Participate in the test review walkthroughs and capture the review comments. Incorporate the review comments and baseline the test plan artefacts.

Test Execution – Participate in the daily stand-up and raise concerns and recommendations (if any). Work on test allocation based on the entry criteria. Capture the test results and log defects as per the defined template. Participate in defect triage and identify resolutions with the help of the build team. Retest and close defects. Support the QA manager on daily reports. Periodically review the test results and traceability matrix, and work with the QA manager on any deviation.

Test Closure – Review the traceability matrix and organise the test artifacts in the designated folders. Review the exit criteria and update the checklist. End-state the defects with appropriate remarks. Prepare the test summary report data points. Participate in and contribute to the retrospective/lessons-learned session.

Mandatory Skills Description: Required skills and experience: Overall 7+ years of experience as a Test Analyst. Strong experience testing Experian PowerCurve (ideally) or similar decisioning platforms in the Collections domain (e.g., FICO). Exposure to PowerCurve workflow configurations (email/SMS/letter), decision trees, and route logic. Familiarity with data validation in complex relational databases (SQL queries).
Understanding of the loan lifecycle, credit/debit card operations, financial hardship, arrears handling, and communication channel prioritization (SMS, email, letters). Experience in a client-facing IC role. Exposure to SDLC and STLC. Banking experience in lending is mandatory, with a good understanding of the loan lifecycle. Strong experience in testing of core banking systems. Test methodology – Waterfall, Agile, and DevOps. Testing expertise – requirement gathering, test planning techniques, defect management. Exposure to collaboration tools – e.g., Jira, Confluence, Teams, SharePoint. Exposure to test and defect management tools – e.g., qTest and Jira. Clear understanding of test governance. Agile methodology/Kanban preferred. Domains: core banking systems, lending/financial hardship processes, payment processing, retail banking.
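The downstream data-flow validation this role describes often reduces to diffing a source table against its downstream copy. A hedged sketch, using in-memory SQLite and invented table names (these are not actual PowerCurve/PCC schema):

```python
import sqlite3

# Hypothetical source table vs. downstream copy; names are illustrative only.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE src_accounts (account_id INTEGER, status TEXT);
CREATE TABLE pcc_accounts (account_id INTEGER, status TEXT);
INSERT INTO src_accounts VALUES (1, 'ARREARS'), (2, 'RESTRUCTURE'), (3, 'CURRENT');
INSERT INTO pcc_accounts VALUES (1, 'ARREARS'), (3, 'CURRENT');
""")

# EXCEPT surfaces rows present in the source but missing (or different)
# downstream -- a common ad-hoc validation query during regression testing.
missing = con.execute("""
SELECT account_id, status FROM src_accounts
EXCEPT
SELECT account_id, status FROM pcc_accounts
""").fetchall()
print(missing)
```

Running the query in both directions (source EXCEPT target, then target EXCEPT source) yields a full two-way diff of the datasets.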

Posted 1 week ago

Apply

2.0 years

3 - 7 Lacs

India

On-site

We are looking for dynamic and experienced Technical Trainers to join our team and deliver high-quality training sessions to engineering and management students. Trainers will play a key role in preparing students for placement drives, internships, and real-world job scenarios.

Key Responsibilities:

Training Delivery: Conduct interactive and engaging classroom sessions in: DSA (Arrays, Linked Lists, Trees, Graphs, Sorting/Searching, Recursion, Dynamic Programming, etc.); Excel (Functions, Charts, PivotTables, Data Analysis, Dashboards, and Advanced Tools); SQL (Data querying, Joins, Aggregations, Subqueries, Stored Procedures, and Performance Tuning). Explain concepts clearly and practically to students with varying levels of proficiency. Use hands-on exercises, live demonstrations, and assessments to ensure learning outcomes.

Content Development & Planning: Design curriculum and session plans aligned with academic goals and industry standards. Create and maintain training materials, worksheets, and assignments. Update training content based on technological advancements and placement trends.

Student Engagement & Evaluation: Conduct periodic assessments, coding tests, and quizzes to evaluate student learning. Offer mentorship, feedback, and one-on-one support to help students overcome technical challenges. Prepare students for technical interviews and aptitude rounds.

Collaboration & Reporting: Work with the Training & Placement Cell to align training with placement requirements. Share periodic training feedback, reports, and progress updates.

Required Skills & Qualifications:

Technical Expertise: Strong knowledge in one or more of the following areas: DSA, with programming experience in C++, Java, or Python; Excel, including advanced formulas, data visualization, and possibly VBA; SQL, with hands-on experience in writing and optimizing queries.

Educational Background: Bachelor’s or Master’s degree in Computer Science, IT, Engineering, or related fields.
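A classroom-style illustration of the recursion and dynamic-programming topics listed above, of the kind a trainer might walk through: the same Fibonacci recurrence written first as naive recursion, then with memoization.

```python
from functools import lru_cache

def fib_naive(n: int) -> int:
    """Plain recursion: exponential time -- useful for motivating memoization."""
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n: int) -> int:
    """Same recurrence, but each subproblem is computed once (linear time)."""
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

print(fib_naive(10))  # 55
print(fib_memo(50))   # 12586269025 -- infeasible for the naive version
```

Contrasting the two versions makes the core DP idea concrete: overlapping subproblems plus caching turn an exponential algorithm into a linear one.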
Experience: Minimum 2 years of teaching/training experience in technical subjects. Experience working with college students or in an educational setup is preferred. Soft Skills: Excellent communication and presentation skills. Strong student-handling and mentoring abilities. Organized, patient, and passionate about teaching. Job Type: Full-time Pay: ₹25,000.00 - ₹60,000.00 per month Benefits: Paid sick time Schedule: Day shift Work Location: In person

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Position Summary

AI & Data

In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: implement large-scale data ecosystems, including data management, governance and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms; leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions; and drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements.

Power BI

The position is suited for individuals who have the ability to work in a constantly challenging environment and deliver effectively and efficiently. The individual will need to be adaptive and able to react quickly to changing business needs.

Work you’ll do / Responsibilities: Proficient in creating and maintaining Tableau workbooks and Power BI reports, including designing visualizations, creating calculated fields, and managing data sources. Skilled in data preparation tasks such as data cleaning, transformation, and aggregation using Tableau Prep Builder, Power Query, or similar tools. Knowledgeable in data visualization best practices, including effective representation of data using various chart types, colors, and layouts.
Familiar with data warehousing concepts and techniques, such as dimensional modeling, star schema, and snowflake schema. Collaborates with business users to gather requirements, understand data sources, and translate business needs into technical solutions. Strong analytical and problem-solving skills, capable of analyzing complex datasets and deriving meaningful insights. Proficient in creating advanced calculated fields and parameters in Tableau and Power BI, including using nested functions, logical functions, and parameters to enhance data analysis. Experienced in performance tuning and optimization of Tableau and Power BI visualizations, identifying and resolving performance bottlenecks, optimizing data queries, and improving dashboard loading times. Deep understanding of Tableau data extracts, blending, and joins, efficiently working with large datasets, creating data extracts for offline use, and blending data from multiple sources. In-depth knowledge of Power BI data modeling and DAX calculations, creating complex data models, writing DAX expressions for calculations and aggregations, and optimizing data models for performance. Familiarity with Tableau and Power BI server administration and configuration, installing, configuring, and managing Tableau Server and Power BI Service, as well as managing user permissions, schedules, and data sources. Experience integrating Tableau and Power BI visualizations into web applications or portals, embedding visualizations using APIs, integrating with authentication systems, and ensuring compatibility with different browsers and devices. Ability to write complex SQL queries and scripts for data analysis and manipulation, writing efficient queries for data extraction, transformation, and loading (ETL), and performing advanced data analysis using SQL functions and techniques. Excellent communication and presentation skills, able to explain technical concepts to non-technical stakeholders.
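The "advanced data analysis using SQL functions" mentioned above often means window functions. A minimal, hedged sketch using an in-memory SQLite database (requires SQLite >= 3.25, which modern Python bundles); the schema and data are invented for illustration:

```python
import sqlite3

# Hypothetical sales table; names and values are illustrative only.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE sales (category TEXT, product TEXT, units INTEGER);
INSERT INTO sales VALUES
  ('toys', 'kite', 30), ('toys', 'ball', 50),
  ('books', 'atlas', 20), ('books', 'novel', 45);
""")

# ROW_NUMBER() partitioned by category picks the top seller in each category,
# a pattern that plain GROUP BY aggregation cannot express directly.
top_per_category = con.execute("""
SELECT category, product, units FROM (
  SELECT category, product, units,
         ROW_NUMBER() OVER (PARTITION BY category ORDER BY units DESC) AS rn
  FROM sales
) WHERE rn = 1
ORDER BY category;
""").fetchall()
print(top_per_category)
```

The same windowed "top-N per group" pattern carries over unchanged to warehouse engines and underlies many of the ranking visuals built in Tableau or Power BI.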
Our purpose: Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture: Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development: At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive: At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.
Recruiting tips: From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 305695

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Position Summary

AI & Data

In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: implement large-scale data ecosystems, including data management, governance and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms; leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions; and drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements.

Power BI Consultant

The position is suited for individuals who have the ability to work in a constantly challenging environment and deliver effectively and efficiently. The individual will need to be adaptive and able to react quickly to changing business needs.

Work you’ll do / Responsibilities: Proficient in creating and maintaining Tableau workbooks and Power BI reports, including designing visualizations, creating calculated fields, and managing data sources. Skilled in data preparation tasks such as data cleaning, transformation, and aggregation using Tableau Prep Builder, Power Query, or similar tools. Knowledgeable in data visualization best practices, including effective representation of data using various chart types, colors, and layouts.
Familiar with data warehousing concepts and techniques, such as dimensional modeling, star schema, and snowflake schema. Collaborates with business users to gather requirements, understand data sources, and translate business needs into technical solutions. Strong analytical and problem-solving skills, capable of analyzing complex datasets and deriving meaningful insights. Proficient in creating advanced calculated fields and parameters in Tableau and Power BI, including using nested functions, logical functions, and parameters to enhance data analysis. Experienced in performance tuning and optimization of Tableau and Power BI visualizations, identifying and resolving performance bottlenecks, optimizing data queries, and improving dashboard loading times. Deep understanding of Tableau data extracts, blending, and joins, efficiently working with large datasets, creating data extracts for offline use, and blending data from multiple sources. In-depth knowledge of Power BI data modeling and DAX calculations, creating complex data models, writing DAX expressions for calculations and aggregations, and optimizing data models for performance. Familiarity with Tableau and Power BI server administration and configuration, installing, configuring, and managing Tableau Server and Power BI Service, as well as managing user permissions, schedules, and data sources. Experience integrating Tableau and Power BI visualizations into web applications or portals, embedding visualizations using APIs, integrating with authentication systems, and ensuring compatibility with different browsers and devices. Ability to write complex SQL queries and scripts for data analysis and manipulation, writing efficient queries for data extraction, transformation, and loading (ETL), and performing advanced data analysis using SQL functions and techniques. Excellent communication and presentation skills, able to explain technical concepts to non-technical stakeholders.
Our purpose: Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture: Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development: At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive: At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.
Recruiting tips: From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 305692

Posted 1 week ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Position Summary

AI & Data

In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: implement large-scale data ecosystems, including data management, governance and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms; leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions; and drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements.

Power BI

The position is suited for individuals who have the ability to work in a constantly challenging environment and deliver effectively and efficiently. The individual will need to be adaptive and able to react quickly to changing business needs.

Work you’ll do / Responsibilities: Proficient in creating and maintaining Tableau workbooks and Power BI reports, including designing visualizations, creating calculated fields, and managing data sources. Skilled in data preparation tasks such as data cleaning, transformation, and aggregation using Tableau Prep Builder, Power Query, or similar tools. Knowledgeable in data visualization best practices, including effective representation of data using various chart types, colors, and layouts.
Familiar with data warehousing concepts and techniques, such as dimensional modeling, star schema, and snowflake schema. Collaborates with business users to gather requirements, understand data sources, and translate business needs into technical solutions. Strong analytical and problem-solving skills, capable of analyzing complex datasets and deriving meaningful insights. Proficient in creating advanced calculated fields and parameters in Tableau and Power BI, including using nested functions, logical functions, and parameters to enhance data analysis. Experienced in performance tuning and optimization of Tableau and Power BI visualizations, identifying and resolving performance bottlenecks, optimizing data queries, and improving dashboard loading times. Deep understanding of Tableau data extracts, blending, and joins, efficiently working with large datasets, creating data extracts for offline use, and blending data from multiple sources. In-depth knowledge of Power BI data modeling and DAX calculations, creating complex data models, writing DAX expressions for calculations and aggregations, and optimizing data models for performance. Familiarity with Tableau and Power BI server administration and configuration, installing, configuring, and managing Tableau Server and Power BI Service, as well as managing user permissions, schedules, and data sources. Experience integrating Tableau and Power BI visualizations into web applications or portals, embedding visualizations using APIs, integrating with authentication systems, and ensuring compatibility with different browsers and devices. Ability to write complex SQL queries and scripts for data analysis and manipulation, writing efficient queries for data extraction, transformation, and loading (ETL), and performing advanced data analysis using SQL functions and techniques. Excellent communication and presentation skills, able to explain technical concepts to non-technical stakeholders.
Our purpose

Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture

Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development

At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive

At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.
Recruiting tips

From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 305695

Posted 1 week ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Position Summary

AI & Data

In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:

Implement large-scale data ecosystems, including data management, governance and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms

Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions

Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements

Power BI Consultant

The position is suited for individuals who can work in a constantly challenging environment and deliver effectively and efficiently. The individual will need to be adaptive and able to react quickly to changing business needs.

Work you’ll do / Responsibilities:

Proficient in creating and maintaining Tableau workbooks and Power BI reports, including designing visualizations, creating calculated fields, and managing data sources.

Skilled in data preparation tasks such as data cleaning, transformation, and aggregation using Tableau Prep Builder, Power Query, or similar tools.

Knowledgeable in data visualization best practices, including effective representation of data using various chart types, colors, and layouts.
Familiar with data warehousing concepts and techniques, such as dimensional modeling, star schema, and snowflake schema.

Collaborates with business users to gather requirements, understand data sources, and translate business needs into technical solutions.

Strong analytical and problem-solving skills, capable of analyzing complex datasets and deriving meaningful insights.

Proficient in creating advanced calculated fields and parameters in Tableau and Power BI, including using nested functions, logical functions, and parameters to enhance data analysis.

Experienced in performance tuning and optimization of Tableau and Power BI visualizations: identifying and resolving performance bottlenecks, optimizing data queries, and improving dashboard loading times.

Deep understanding of Tableau data extracts, blending, and joins: efficiently working with large datasets, creating data extracts for offline use, and blending data from multiple sources.

In-depth knowledge of Power BI data modeling and DAX calculations: creating complex data models, writing DAX expressions for calculations and aggregations, and optimizing data models for performance.

Familiarity with Tableau and Power BI server administration and configuration: installing, configuring, and managing Tableau Server and Power BI Service, as well as managing user permissions, schedules, and data sources.

Experience integrating Tableau and Power BI visualizations into web applications or portals: embedding visualizations using APIs, integrating with authentication systems, and ensuring compatibility with different browsers and devices.

Ability to write complex SQL queries and scripts for data analysis and manipulation: writing efficient queries for extraction, transformation, and loading (ETL), and performing advanced data analysis using SQL functions and techniques.

Excellent communication and presentation skills, able to explain technical concepts to non-technical stakeholders.
Our purpose

Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture

Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development

At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive

At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.
Recruiting tips

From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 305692

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Position Summary

AI & Data

In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:

Implement large-scale data ecosystems, including data management, governance and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms

Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions

Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements

Power BI

The position is suited for individuals who can work in a constantly challenging environment and deliver effectively and efficiently. The individual will need to be adaptive and able to react quickly to changing business needs.

Work you’ll do / Responsibilities:

Proficient in creating and maintaining Tableau workbooks and Power BI reports, including designing visualizations, creating calculated fields, and managing data sources.

Skilled in data preparation tasks such as data cleaning, transformation, and aggregation using Tableau Prep Builder, Power Query, or similar tools.

Knowledgeable in data visualization best practices, including effective representation of data using various chart types, colors, and layouts.
Familiar with data warehousing concepts and techniques, such as dimensional modeling, star schema, and snowflake schema.

Collaborates with business users to gather requirements, understand data sources, and translate business needs into technical solutions.

Strong analytical and problem-solving skills, capable of analyzing complex datasets and deriving meaningful insights.

Proficient in creating advanced calculated fields and parameters in Tableau and Power BI, including using nested functions, logical functions, and parameters to enhance data analysis.

Experienced in performance tuning and optimization of Tableau and Power BI visualizations: identifying and resolving performance bottlenecks, optimizing data queries, and improving dashboard loading times.

Deep understanding of Tableau data extracts, blending, and joins: efficiently working with large datasets, creating data extracts for offline use, and blending data from multiple sources.

In-depth knowledge of Power BI data modeling and DAX calculations: creating complex data models, writing DAX expressions for calculations and aggregations, and optimizing data models for performance.

Familiarity with Tableau and Power BI server administration and configuration: installing, configuring, and managing Tableau Server and Power BI Service, as well as managing user permissions, schedules, and data sources.

Experience integrating Tableau and Power BI visualizations into web applications or portals: embedding visualizations using APIs, integrating with authentication systems, and ensuring compatibility with different browsers and devices.

Ability to write complex SQL queries and scripts for data analysis and manipulation: writing efficient queries for extraction, transformation, and loading (ETL), and performing advanced data analysis using SQL functions and techniques.

Excellent communication and presentation skills, able to explain technical concepts to non-technical stakeholders.
Our purpose

Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture

Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development

At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive

At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.
Recruiting tips

From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 305695

Posted 1 week ago

Apply