7.0 - 11.0 years
0 Lacs
karnataka
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

We are seeking a passionate data analyst to transform data into actionable insights and support decision-making in a global organization focused on pricing and commercial strategy. This role spans business analysis, requirements gathering, data modeling, solution design, and visualization using modern tools. The analyst will also maintain and improve existing analytics solutions, interpret complex datasets, and communicate findings clearly to both technical and non-technical audiences.

**Essential Functions of the Job:**
- Analyze and interpret structured and unstructured data using statistical and quantitative methods to generate actionable insights and ongoing reports.
- Design and implement data pipelines and processes for data cleaning, transformation, modeling, and visualization using tools such as Power BI, SQL, and Python.
- Collaborate with stakeholders to define requirements, prioritize business needs, and translate problems into analytical solutions.
- Develop, maintain, and enhance scalable analytics solutions and dashboards that support pricing strategy and commercial decision-making.
- Identify opportunities for process improvement and operational efficiency through data-driven recommendations.
- Communicate complex findings in a clear, compelling, and actionable manner to both technical and non-technical audiences.

**Analytical/Decision Making Responsibilities:**
- Apply a hypothesis-driven approach to analyzing ambiguous or complex data and synthesizing insights to guide strategic decisions.
- Promote adoption of best practices in data analysis, modeling, and visualization, while tailoring approaches to meet the unique needs of each project.
- Tackle analytical challenges with creativity and rigor, balancing innovative thinking with practical problem-solving across varied business domains.
- Prioritize work based on business impact and deliver timely, high-quality results in fast-paced environments with evolving business needs.
- Demonstrate sound judgment in selecting methods, tools, and data sources to support business objectives.

**Knowledge and Skills Requirements:**
- Proven experience as a data analyst, business analyst, data engineer, or similar role.
- Strong analytical skills with the ability to collect, organize, analyze, and present large datasets accurately.
- Foundational knowledge of statistics, including concepts like distributions, variance, and correlation.
- Skilled in documenting processes and presenting findings to both technical and non-technical audiences.
- Hands-on experience with Power BI for designing, developing, and maintaining analytics solutions.
- Proficient in both Python and SQL, with strong programming and scripting skills.
- Skilled in using Pandas, T-SQL, and Power Query M for querying, transforming, and cleaning data.
- Hands-on experience in data modeling for both transactional (OLTP) and analytical (OLAP) database systems.
- Strong visualization skills using Power BI and Python libraries such as Matplotlib and Seaborn.
- Experience with defining and designing KPIs and aligning data insights with business goals.

**Additional/Optional Knowledge and Skills:**
- Experience with the Microsoft Fabric data analytics environment.
- Proficiency in using the Apache Spark distributed analytics engine, particularly via PySpark and Spark SQL.
- Exposure to implementing machine learning or AI solutions in a business context.
- Familiarity with Python machine learning libraries such as scikit-learn, XGBoost, PyTorch, or transformers.
- Experience with Power Platform tools (Power Apps, Power Automate, Dataverse, Copilot Studio, AI Builder).
- Knowledge of pricing, commercial strategy, or competitive intelligence.
- Experience with cloud-based data services, particularly in the Azure ecosystem (e.g., Azure Synapse Analytics or Azure Machine Learning).

**Supervision Responsibilities:**
- Operates with a high degree of independence and autonomy.
- Collaborates closely with cross-functional teams including sales, pricing, and commercial strategy.
- Mentors junior team members, helping develop technical skills and business domain knowledge.

**Other Requirements:**
- Collaborates with a team operating primarily in the Eastern Time Zone (UTC−4:00 / −5:00).
- Limited travel may be required for this role.

**Job Requirements:**

**Education:** A bachelor's degree in a STEM field relevant to data analysis, data engineering, or data science is required. Examples include (but are not limited to) computer science, statistics, data analytics, artificial intelligence, operations research, or econometrics.

**Experience:** 6+ years of experience in data analysis, data engineering, or a closely related field, ideally within a professional services environment.

**Certification Requirements:** No certifications are required for this role.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
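The statistics fundamentals the posting lists (distributions, variance, correlation) reduce to a few lines of Python. A minimal sketch, with invented discount and margin figures standing in for real pricing data:

```python
from statistics import mean, pvariance

# Toy pricing dataset: discount offered (%) vs. realized deal margin (%).
discounts = [0, 5, 10, 15, 20, 25]
margins = [32, 30, 27, 25, 21, 18]

def pearson(xs, ys):
    """Pearson correlation from first principles (population form)."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (pvariance(xs) ** 0.5 * pvariance(ys) ** 0.5)

r = pearson(discounts, margins)  # strongly negative: deeper discounts, thinner margins
```

In a real engagement the same calculation would typically run on a Pandas DataFrame column pair rather than hand-typed lists.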
Posted 1 week ago
3.0 - 6.0 years
10 - 20 Lacs
pune
Remote
Work closely with clients to understand business needs, design data solutions, and deliver insights through end-to-end data management. Lead project execution, handle communication and documentation, and guide team members throughout.

Required Candidate profile: Must have hands-on experience with Python, ETL tools (Fivetran, StitchData), databases, and cloud platforms (AWS, GCP, Azure, Snowflake, Databricks). Familiarity with REST/SOAP APIs is essential.
Posted 1 week ago
4.0 - 6.0 years
1 - 2 Lacs
gurugram
Work from Office
- Experience in Big Data technologies, specifically Spark, Python, Hive, SQL, Presto (or other query engines), big data storage formats (e.g., Parquet), orchestration tools (e.g., Apache Airflow), and version control (e.g., Bitbucket).
- Proficiency in developing configuration-based ETL pipelines and user-interface-driven tools to optimize data processes and calculations (e.g., Dataiku).
- Experience in analysing business requirements and solution design, including the design of data models, data pipelines, and calculations, as well as presenting solution options and recommendations.
- Experience working in a cloud-based environment (ideally AWS), with a solid understanding of cloud computing concepts (EC2, S3), Linux, and containerization technologies (Docker and Kubernetes).
- A background in solution delivery within the finance or treasury business domains, particularly in areas such as Liquidity or Capital, is advantageous.

Additional pointers: We are seeking a mid-level engineer to design and build Liquidity calculations using our bespoke Data Calculation Platform (DCP), based on documented business requirements. The front-end of the DCP is Dataiku, but prior experience with Dataiku is not necessary for candidates with data engineering experience. Liquidity experience is likewise not required, though experience designing and building to business requirements is helpful. The role requires working 3 days per week in the Gurugram office, as part of a team split between Sydney and Gurugram; the reporting manager is based in Gurugram and project leadership in Sydney.
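A "configuration-based ETL pipeline" of the kind described can be sketched in plain Python: the calculation (filter, group, aggregate) is expressed as data rather than code. The liquidity-style config and rows below are invented for illustration:

```python
# Each pipeline step is declared as configuration, not hand-written logic.
config = {
    "filter": lambda row: row["product"] == "deposit",
    "group_by": "currency",
    "aggregate": ("balance", sum),
}

rows = [
    {"product": "deposit", "currency": "USD", "balance": 100.0},
    {"product": "deposit", "currency": "USD", "balance": 50.0},
    {"product": "loan",    "currency": "USD", "balance": 999.0},
    {"product": "deposit", "currency": "EUR", "balance": 70.0},
]

def run_step(rows, cfg):
    """Apply one configured filter/group/aggregate step to a list of records."""
    col, agg = cfg["aggregate"]
    grouped = {}
    for row in filter(cfg["filter"], rows):
        grouped.setdefault(row[cfg["group_by"]], []).append(row[col])
    return {key: agg(vals) for key, vals in grouped.items()}

result = run_step(rows, config)  # {"USD": 150.0, "EUR": 70.0}
```

Tools like Dataiku apply the same idea at platform scale: analysts edit the configuration while the engine owns execution.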
Posted 1 week ago
8.0 - 13.0 years
25 - 30 Lacs
chennai, bengaluru
Work from Office
• Experience in cloud-based systems (GCP, BigQuery)
• Strong SQL programming skills
• Expertise in database programming and performance tuning techniques
• Familiarity with data movement techniques and best practices
Posted 1 week ago
4.0 - 10.0 years
0 - 0 Lacs
chennai, tamil nadu
On-site
You will be responsible for utilizing your excellent Data Wrangling skills to query, join, and manipulate data, as well as automate processes as required. Your role will involve analyzing business and marketing program performance, including response, sales, incremental sales, net profit, and customer profiling. Proficiency in SQL, Alteryx, Qlik (dashboarding), Hadoop, GCP, and Microsoft Office is preferred for this position.

As a subject matter expert, you will focus on sales, business, and financial reporting, along with Customer Analytics, Customer Experience Analytics, and Program Analytics. Additionally, you will develop and support Business Performance Metrics and Reporting (BPR) while having the opportunity to work with automotive service and parts business data.

Direct interaction with clients to understand their business needs and present findings will be a crucial aspect of your role. You should be able to organize findings for business consumption, create insightful PowerPoints containing graphs, and effectively present your findings to clients and team members. Experience in complex problem solving, data manipulation, data quality, statistical analysis, and reporting is essential. Proficiency in utilizing SQL, Alteryx, Tableau, Qlik, etc., to manipulate, arrange, query, visualize, and present data will be a key requirement. Your strong interpersonal skills will enable you to collaborate effectively across departments and skill sets to meet client deliverables.

Being a subject matter expert in an analytics specialty and holding a BA/BS or an advanced degree in relevant fields is preferred. With a minimum of 5 years of experience as an analyst, you should be an inventive and quick learner, capable of efficiently seeking out, learning, and applying new areas of expertise when required. Your highly self-motivated nature will allow you to work independently and respond effectively and timely to tasks and challenges.

This position requires 4 to 10 years of experience, offering a salary ranging from 4 Lac 25 Thousand to 15 Lac P.A. within the IT Software - Application Programming / Maintenance industry. The ideal candidate should hold qualifications such as B.C.A, B.E, B.Sc, B.Tech, M.C.A, M.Tech, M.Sc, Other Bachelor Degree, or Post Graduate Diploma. Key skills for this role include Alteryx, SQL, Data Analyst, Data Engineer, MS Office, GCP, Hadoop, Qlik, Retention, Lincoln, Recall, Data Operations, and a proactive attitude towards learning and development.
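The response and incremental-sales analysis this role describes comes down to comparing a treated group against a scaled holdout baseline. A sketch on invented campaign numbers:

```python
# Hypothetical campaign readout: treated vs. holdout (control) groups.
treated = {"customers": 10_000, "responders": 420, "sales": 189_000.0}
control = {"customers": 5_000, "responders": 110, "sales": 49_500.0}

response_rate = treated["responders"] / treated["customers"]   # treated response
baseline_rate = control["responders"] / control["customers"]   # holdout response

# Scale control sales to the treated group's size for a like-for-like baseline,
# then attribute anything above that baseline to the program.
expected_sales = control["sales"] * treated["customers"] / control["customers"]
incremental_sales = treated["sales"] - expected_sales
```

In practice the same arithmetic would be run per segment in SQL or Alteryx before it ever reaches a PowerPoint.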
Posted 1 week ago
4.0 - 7.0 years
10 - 20 Lacs
bengaluru
Work from Office
- Integrate various technologies and data sources within the Microsoft Azure ecosystem
- Strong proficiency in Python, SQL, and Databricks
- Hands-on experience with Azure data services and cloud-based data integration
- Familiarity with DataOps principles
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
You should have a minimum of 5 to 9 years of experience working with Snowflake, Oracle development, SQL, data engineering, and ETL. Additionally, a certification or project experience in data warehousing would be beneficial for this role.
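A miniature ETL pass of the stage-transform-load kind this role involves, using SQLite as a stand-in for Snowflake or Oracle (schema and rows invented):

```python
import sqlite3

# Stage raw, messy rows; then transform into a clean target table with SQL.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE staging (order_id INTEGER, amount TEXT)")
con.executemany(
    "INSERT INTO staging VALUES (?, ?)",
    [(1, " 10.50"), (2, "3.25 "), (2, "3.25 "), (3, None)],  # dupes + a NULL
)

# Transform: trim whitespace, cast to numeric, de-duplicate, drop NULLs.
con.execute("""
    CREATE TABLE orders AS
    SELECT DISTINCT order_id, CAST(TRIM(amount) AS REAL) AS amount
    FROM staging
    WHERE amount IS NOT NULL
""")
loaded = con.execute(
    "SELECT order_id, amount FROM orders ORDER BY order_id"
).fetchall()  # [(1, 10.5), (2, 3.25)]
```

Against Snowflake or Oracle the connection and dialect change, but the stage/transform/load shape stays the same.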
Posted 1 week ago
4.0 - 8.0 years
8 - 12 Lacs
bengaluru
Work from Office
- Solid experience in writing and optimizing complex SQL queries.
- Proficiency with at least one modern BI tool (Sigma Computing, Tableau, Looker, or Power BI).
- Exposure to Python for data transformations/automation.
- Familiarity with cloud data platforms.
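Query optimization of the kind listed here can be demonstrated with SQLite's `EXPLAIN QUERY PLAN`: the same lookup goes from a full table scan to an index search once an index exists. Table and data are invented:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount REAL)")
con.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("north" if i % 2 == 0 else "south", float(i)) for i in range(1000)],
)

def plan(sql):
    """Concatenate the 'detail' column of SQLite's query-plan output."""
    return " ".join(row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(amount) FROM sales WHERE region = 'north'"
before = plan(query)                                   # full table SCAN
con.execute("CREATE INDEX idx_sales_region ON sales(region)")
after = plan(query)                                    # SEARCH via the index
```

The same habit (read the plan before and after the change) carries over to warehouse engines, though each has its own plan syntax.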
Posted 1 week ago
3.0 - 5.0 years
5 - 10 Lacs
hyderabad, pune, bengaluru
Work from Office
Python + SQL experts in the Digital Marketing domain within the Data & Analytics space typically play a critical role in driving data-informed marketing strategies. Here's a breakdown of the role and responsibilities:

Role Overview: These professionals analyse digital marketing data using Python and SQL to extract insights, optimize campaigns, and support decision-making. They serve as the bridge between raw marketing data and actionable business intelligence.

Key Responsibilities:
- Data Extraction & Transformation: Use SQL to query large marketing databases (e.g., from Google Analytics, ad platforms, CRMs). Clean, transform, and prepare datasets using Python (Pandas, NumPy).
- Marketing Analytics: Track performance of digital campaigns (SEO, SEM, social media, email marketing). Build marketing attribution models to understand ROI.
- Dashboards & Reporting: Develop automated reports and dashboards using tools like Tableau, Power BI, or Python libraries (e.g., Plotly, Dash). Provide stakeholders with real-time insights.
- Predictive Modelling & Insights: Build models for customer segmentation, churn prediction, LTV analysis, and campaign optimization. Use machine learning (e.g., scikit-learn, XGBoost) for targeting and personalization.
- Cross-functional Collaboration: Work with marketing, product, and sales teams to define KPIs and optimize strategies. Translate business questions into data problems and communicate results clearly.
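The campaign-performance metrics above (conversion rate, cost per acquisition, ROI) are a few lines of Python once the data is extracted. All channel figures below are invented:

```python
# Per-channel campaign results, as they might arrive from an ad-platform export.
campaigns = [
    {"channel": "email", "spend": 1000.0, "clicks": 500,
     "conversions": 50, "revenue": 4000.0},
    {"channel": "sem", "spend": 5000.0, "clicks": 2000,
     "conversions": 120, "revenue": 9000.0},
]

report = {
    c["channel"]: {
        "cvr": c["conversions"] / c["clicks"],              # conversion rate
        "cpa": c["spend"] / c["conversions"],               # cost per acquisition
        "roi": (c["revenue"] - c["spend"]) / c["spend"],    # return on spend
    }
    for c in campaigns
}
```

A dashboarding layer (Tableau, Plotly, Dash) would then render `report` rather than recompute it.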
Posted 1 week ago
6.0 - 11.0 years
6 - 13 Lacs
pune
Work from Office
Recruiter LinkedIn: linkedin.com/in/yashsharma1608
Contract period: 6-12 months
Payroll: ASV Consulting; client: Team Computers
Job location: Pune, onsite (WFO)
Budget: up to 12-13 LPA, depending on last drawn (relevant hike)
Experience: 6+ years

Job Description: Data Architect or Senior Data Engineer

Experience: minimum 6+ years. Immediate joiners for Pune. Budget: 1,00,000-1,10,000 per month.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Management, or a related field.
- Proven experience as a Data Architect or Senior Data Engineer.
- Strong expertise in data modeling, database design (SQL & NoSQL), and ETL processes.
- Hands-on experience with cloud platforms (AWS, Azure, GCP) and modern data ecosystems (Snowflake, Databricks, BigQuery, Redshift, etc.).
- Knowledge of big data technologies (Hadoop, Spark, Kafka) is a plus.
- Proficiency in data governance, data security, and compliance frameworks.
- Strong problem-solving and analytical skills with the ability to handle complex data challenges.
- Excellent communication skills to collaborate with both technical and non-technical stakeholders.
- Experience with data lakes and data warehouses.
- Experience in data management.
Posted 1 week ago
5.0 - 7.0 years
12 - 19 Lacs
pune
Hybrid
• Strong programming skills in Python and SQL.
• Hands-on experience with Databricks and Apache Spark.
• Proficiency in working with RESTful APIs and data integration techniques.
• Experience with CI/CD tools such as Jenkins, GitHub Actions, or Azure DevOps.
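A typical REST-integration subtask implied here is flattening a nested API payload into tabular rows before loading. The payload shape below is hypothetical:

```python
# A nested REST-style response, as if returned by some upstream service.
payload = {
    "data": [
        {"id": 1, "attrs": {"name": "alpha", "score": 0.9}},
        {"id": 2, "attrs": {"name": "beta", "score": 0.4}},
    ]
}

def flatten(resp):
    """Lift each item's nested attributes up into a flat record."""
    return [{"id": item["id"], **item["attrs"]} for item in resp.get("data", [])]

rows = flatten(payload)
```

In a live pipeline the payload would come from an HTTP client (with retries and paging) and `rows` would feed a Spark or warehouse load step.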
Posted 1 week ago
4.0 - 9.0 years
7 - 17 Lacs
bengaluru
Work from Office
**Job Description – Azure Data Engineer**

Company: CGI
Position: Azure Data Engineer
Experience: 4-8 years
Location: Bangalore (Yemlur)
Interview Mode: Face-to-Face (6th September 2025, Saturday)
Compensation: up to 16-17 LPA
Notice Period: only September joiners considered
Job ID: J0725-0611

**Role Overview**
We are looking for an experienced Azure Data Engineer with strong expertise in modern data platforms, data integration, and advanced analytics. The ideal candidate must have hands-on experience with Azure Data Factory, Databricks, Python, and SQL, and should be able to design, develop, and optimize scalable data pipelines and solutions in Azure Cloud.

**Key Responsibilities**
- Design and build highly scalable data pipelines using Azure Data Factory and Azure Databricks.
- Develop ETL workflows to extract, transform, and load data from multiple sources into Azure data platforms.
- Write optimized Python and SQL scripts for data cleansing, transformation, and reporting.
- Work with structured and unstructured data and implement best practices in data modeling.
- Collaborate with business analysts, architects, and data scientists to deliver high-quality solutions.
- Ensure data quality, security, and governance across the platform.
- Monitor and optimize system performance, troubleshoot production issues, and implement preventive measures.
- Stay updated with the latest Azure services and recommend improvements in architecture and processes.

**Mandatory Skills**
- Azure Data Engineering (hands-on experience designing and deploying solutions).
- Azure Data Factory – data pipelines, orchestration, triggers, linked services.
- Azure Databricks – Spark-based data processing, notebooks, Delta Lake.
- Python – advanced scripting for data transformation and automation.
- SQL – strong expertise in writing complex queries, stored procedures, and performance tuning.

**Nice to Have (Preferred)**
- Experience with Azure Synapse Analytics.
- Knowledge of CI/CD pipelines and DevOps for data workflows.
- Familiarity with cloud security, compliance, and monitoring tools.
- Exposure to Agile methodology and working in distributed teams.

**Candidate Requirements**
- Minimum 4+ years of overall experience, with 3+ years in Azure Data Engineering.
- Strong hands-on expertise across all mandatory skills (ADF, Databricks, Python, SQL).
- Must be available for a face-to-face interview on 6th September (Saturday) at CGI – Yemlur, Bangalore.
- Must be an immediate joiner (September 2025).
- Excellent communication skills and ability to work independently with minimal supervision.
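The "data cleansing" responsibility above can be sketched in plain Python so the logic is visible without a Spark cluster; record contents are invented, and a real ADF/Databricks pipeline would run the equivalent as a notebook or SQL step:

```python
raw = [
    {"customer": " Acme ", "amount": "120.0"},
    {"customer": "Acme",   "amount": "120.0"},   # duplicate once trimmed
    {"customer": "Globex", "amount": "bad"},     # unparseable, dropped
]

def cleanse(records):
    """Trim names, cast amounts, drop unparseable rows, de-duplicate."""
    seen, out = set(), []
    for r in records:
        name = r["customer"].strip()
        try:
            amount = float(r["amount"])
        except ValueError:
            continue
        key = (name, amount)
        if key not in seen:
            seen.add(key)
            out.append({"customer": name, "amount": amount})
    return out

clean = cleanse(raw)  # one surviving Acme row
```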
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
The role of Assistant Content Manager (ACM) as Asst Manager - Content Development in Bangalore provides a unique opportunity to embark on a journey of rapid learning, industry-relevant project development, collaboration with top professionals and faculty, team coordination, and direct impact on learners' careers. At upGrad, we are looking for passionate individuals who are deeply interested in education and technology to contribute towards designing cutting-edge learning programs tailored for working professionals.

As an ACM, your responsibilities will include understanding the needs of the industry and learners to craft high-quality courses, overseeing and coordinating teams to ensure content quality and timely delivery, strategizing and implementing student assessments and engagement techniques, conducting research to enhance program-market fit, creating instructional content such as videos, assessments, and discussions in collaboration with internal teams, establishing and nurturing a network of subject matter experts and faculty members, defining content development processes to enhance learning experiences, as well as troubleshooting and optimizing content and program challenges post-launch.

The mandatory requirements for this role include being a graduate in STEM (BTech/BE/BCA) with strong Python programming skills. Preferred skills for this position encompass knowledge of SQL, a preference for postgraduate qualifications (MTech/ME/MCA), teaching experience either online or offline, an inclination towards analytics and machine learning, experience in developing and managing digital educational content, and possessing strong problem-solving abilities along with structured process management skills.

Join us at upGrad and be a part of a dynamic team committed to revolutionizing the e-learning landscape by delivering impactful and innovative learning experiences to professionals across various industries.
Posted 1 week ago
4.0 - 12.0 years
0 Lacs
maharashtra
On-site
As an Azure Data Engineer specializing in Microsoft Fabric (Data Lake) based in Mumbai, you should have a minimum of 4 years of experience in the field, with at least 2 years dedicated to working with Microsoft Fabric technologies. Your expertise in Azure services is key, specifically in Data Lake, Synapse Analytics, Data Factory, Azure Storage, and Azure SQL.

Your responsibilities will involve data modeling, ETL/ELT processes, and data integration patterns. It is essential to have experience in Power BI integration for effective data visualization. Proficiency in SQL, Python, or PySpark for data transformations is required for this role. A solid understanding of data governance, security, and compliance in cloud environments is also necessary. Previous experience working in Agile/Scrum environments is a plus. Strong problem-solving skills and the ability to work both independently and collaboratively within a team are crucial for success in this position.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
maharashtra
On-site
This role is with a big pharma company and is based in either Chennai or Mumbai. They are looking for a Data Engineer with at least 4 years of experience. The position is initially a long-term contract which is expected to become permanent after 6 months. The ideal candidate should have expertise in Snowflake, SQL, and possess excellent communication skills.
Posted 1 week ago
0.0 - 5.0 years
3 - 8 Lacs
vijayawada, visakhapatnam, hyderabad
Work from Office
If you know anyone interested in the position, let us know; it would be helpful.

Job Title: Bench Sales Recruiter
Location: Onsite, Somajiguda
Experience Level: 6 months to 7 years
Contact: +91 78158 82181 - Aditya
Email: sukesh@cogentcube.com
Note: Incentives will be credited in 15 days. Food will be provided in the office.
Office: # 6-3-1100/5, 3rd Floor, Raj Bhavan Road, Somajiguda, Hyderabad, Telangana, 500082
Role: IT Recruiter
Industry Type: Recruitment / Staffing
Department: Human Resources
Employment Type: Full Time, Permanent
Role Category: Recruitment & Talent Acquisition
Posted 1 week ago
6.0 - 8.0 years
15 - 25 Lacs
hyderabad, pune, gurugram
Work from Office
HIRING: SENIOR DATA ENGINEER
Locations: Hyderabad, Chennai, Pune, Bangalore, Gurgaon
Experience: 6-8 years
CTC: up to 28 LPA
Notice Period: immediate to 30 days
Skills: SQL, ETL, Data Transformation, SSIS, SSRS, Performance Tuning, Data Validation & Reconciliation
Drop your CV at rashibimaginators@gmail.com

Required Candidate profile: Senior Data Engineer (MSBI / ETL) with SQL, ETL, Data Transformation, SSIS, SSRS, Performance Tuning, Data Validation & Reconciliation, ETL workflows, and experience building reports and dashboards.
Posted 1 week ago
6.0 - 8.0 years
8 - 18 Lacs
hyderabad
Work from Office
Job title: Data Engineer (Python + PySpark + SQL)

Candidate specification: minimum 6 to 8 years of experience as a Data Engineer.

Job description:
- Data Engineer with strong expertise in Python, PySpark, and SQL; expert in Kafka.
- Design, develop, and maintain robust data pipelines using PySpark and Python.
- Strong understanding of SQL and relational databases (e.g., PostgreSQL, MySQL, SQL Server).
- Proficiency in Python for data engineering tasks and scripting.
- Hands-on experience with PySpark in distributed data processing environments.
- Strong command of SQL for data manipulation and querying large datasets.
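A toy version of the streaming aggregation such a Kafka-fed PySpark pipeline performs: per-minute event counts by type. Events are hard-coded here rather than consumed from a topic:

```python
from collections import defaultdict
from datetime import datetime

# (timestamp, event type) pairs, as they might arrive from a Kafka consumer.
events = [
    ("2024-01-01T10:00:05", "click"),
    ("2024-01-01T10:00:45", "click"),
    ("2024-01-01T10:01:10", "view"),
]

# Bucket each event into its one-minute tumbling window and count by type.
windows = defaultdict(int)
for ts, kind in events:
    minute = datetime.fromisoformat(ts).replace(second=0, microsecond=0)
    windows[(minute.isoformat(), kind)] += 1
```

In PySpark Structured Streaming the same logic would be a `groupBy(window(...), col("kind")).count()` over the consumed stream, with the engine handling late data and state.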
Posted 1 week ago
5.0 - 6.0 years
6 - 15 Lacs
hyderabad
Hybrid
About the Role: Join TA Digital (Credera), a global consulting firm delivering AI-driven customer experiences. As a Senior Salesforce CPQ Developer, you will design, develop, and implement advanced CPQ solutions tailored to client needs, driving efficiency across product configuration, pricing, and quoting processes.

Key Responsibilities:
- Collaborate with product specialists and the Development Manager to understand complex requirements and translate them into Salesforce solutions.
- Design and develop Salesforce CPQ solutions to streamline the sales process.
- Customize Salesforce CPQ to meet business requirements, including product configuration, pricing, and quoting.
- Collaborate with business stakeholders to gather and analyze requirements.
- Integrate Salesforce CPQ with other Salesforce modules and third-party applications.
- Develop and maintain custom Apex and Visualforce code, Lightning components, and other Salesforce technologies.
- Perform data migration and integration tasks as needed.
- Conduct unit testing and support user acceptance testing (UAT).
- Troubleshoot and resolve issues related to Salesforce CPQ.
- Stay updated with Salesforce CPQ best practices and new features.
- Provide training and support to end users and other team members.
- Take ownership, work under pressure, and manage multiple projects simultaneously.
- Interact with team members to deliver fast and reliable code, contribute ideas, provide feedback, and collaborate on various projects.

Technical Skills you bring in:
- At least 5 years of Salesforce development experience across Sales, Service, and CPQ cloud platforms.
- 2+ years of experience specifically with Salesforce CPQ.
- Proficiency in Apex, Visualforce, Lightning components, and Salesforce administration.
- Strong understanding of Salesforce CPQ capabilities, including product configuration, pricing rules, quote generation, advanced approval processes, contract management, amendments, and renewals.
- Experience with Salesforce integration tools and techniques (e.g., REST/SOAP APIs, MuleSoft).
- Excellent problem-solving skills and attention to detail.
- Strong communication and interpersonal skills.
- Ability to work independently and as part of a team.
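The pricing-rule logic CPQ automates can be illustrated abstractly: a tiered volume discount applied to a configured quote line. The tiers and prices below are invented for the sketch, and this is plain Python, not the Salesforce CPQ API (where such rules live in price rules and discount schedules):

```python
# Discount tiers as (minimum quantity, discount fraction), highest tier first.
TIERS = [(100, 0.20), (50, 0.10), (0, 0.0)]

def quote_line(unit_price, qty):
    """Price one quote line: pick the first tier the quantity qualifies for."""
    discount = next(d for floor, d in TIERS if qty >= floor)
    return round(unit_price * qty * (1 - discount), 2)

total = quote_line(unit_price=40.0, qty=60)   # qualifies for the 10% tier
```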
Posted 1 week ago
6.0 - 7.0 years
9 - 19 Lacs
hyderabad
Hybrid
About the Role: Join TA Digital (Credera), a global consulting firm delivering AI-driven customer experiences. As a Senior Salesforce CPQ Developer, you will design, develop, and implement advanced CPQ solutions tailored to client needs, driving efficiency across product configuration, pricing, and quoting processes.

Key Responsibilities:
- Collaborate with product specialists and the Development Manager to understand complex requirements and translate them into Salesforce solutions.
- Design and develop Salesforce CPQ solutions to streamline the sales process.
- Customize Salesforce CPQ to meet business requirements, including product configuration, pricing, and quoting.
- Collaborate with business stakeholders to gather and analyze requirements.
- Integrate Salesforce CPQ with other Salesforce modules and third-party applications.
- Develop and maintain custom Apex and Visualforce code, Lightning components, and other Salesforce technologies.
- Perform data migration and integration tasks as needed.
- Conduct unit testing and support user acceptance testing (UAT).
- Troubleshoot and resolve issues related to Salesforce CPQ.
- Stay updated with Salesforce CPQ best practices and new features.
- Provide training and support to end users and other team members.
- Take ownership, work under pressure, and manage multiple projects simultaneously.
- Interact with team members to deliver fast and reliable code, contribute ideas, provide feedback, and collaborate on various projects.

Technical Skills you bring in:
- At least 5 years of Salesforce development experience across Sales, Service, and CPQ cloud platforms.
- 2+ years of experience specifically with Salesforce CPQ.
- Proficiency in Apex, Visualforce, Lightning components, and Salesforce administration.
- Strong understanding of Salesforce CPQ capabilities, including product configuration, pricing rules, quote generation, advanced approval processes, contract management, amendments, and renewals.
- Experience with Salesforce integration tools and techniques (e.g., REST/SOAP APIs, MuleSoft).
- Excellent problem-solving skills and attention to detail.
- Strong communication and interpersonal skills.
- Ability to work independently and as part of a team.
Posted 1 week ago
4.0 - 5.0 years
4 - 9 Lacs
hyderabad
Hybrid
About the Role: Join TA Digital (Credera), a global consulting firm delivering AI-driven customer experiences. As a Senior Salesforce CPQ Developer, you will design, develop, and implement advanced CPQ solutions tailored to client needs, driving efficiency across product configuration, pricing, and quoting processes.

Key Responsibilities:
- Collaborate with product specialists and the Development Manager to understand complex requirements and translate them into Salesforce solutions.
- Design and develop Salesforce CPQ solutions to streamline the sales process.
- Customize Salesforce CPQ to meet business requirements, including product configuration, pricing, and quoting.
- Collaborate with business stakeholders to gather and analyze requirements.
- Integrate Salesforce CPQ with other Salesforce modules and third-party applications.
- Develop and maintain custom Apex and Visualforce code, Lightning components, and other Salesforce technologies.
- Perform data migration and integration tasks as needed.
- Conduct unit testing and support user acceptance testing (UAT).
- Troubleshoot and resolve issues related to Salesforce CPQ.
- Stay updated with Salesforce CPQ best practices and new features.
- Provide training and support to end users and other team members.
- Take ownership, work under pressure, and manage multiple projects simultaneously.
- Interact with team members to deliver fast and reliable code, contribute ideas, provide feedback, and collaborate on various projects.

Technical Skills you bring in:
- At least 5 years of Salesforce development experience across Sales, Service, and CPQ cloud platforms.
- 2+ years of experience specifically with Salesforce CPQ.
- Proficiency in Apex, Visualforce, Lightning components, and Salesforce administration.
- Strong understanding of Salesforce CPQ capabilities, including product configuration, pricing rules, quote generation, advanced approval processes, contract management, amendments, and renewals.
- Experience with Salesforce integration tools and techniques (e.g., REST/SOAP APIs, MuleSoft).
- Excellent problem-solving skills and attention to detail.
- Strong communication and interpersonal skills.
- Ability to work independently and as part of a team.
Posted 1 week ago
3.0 - 7.0 years
10 - 18 Lacs
bengaluru
Work from Office
Job Summary: We are seeking a skilled Data Engineer to maintain robust data infrastructure and pipelines that support our operational analytics and business intelligence needs. Candidates will bridge the gap between data engineering and operations, ensuring reliable, scalable, and efficient data systems that enable data-driven decision-making across the organization.

Responsibilities:
- Maintain ETL/ELT pipelines using modern data engineering tools and frameworks.
- Support data pipeline health, performance, and SLA compliance.
- Document data processes, schemas, and best practices (SOPs).
- Implement data quality checks, monitoring, and alerting systems to ensure data reliability.
- Optimize data pipeline performance and troubleshoot production issues.

Skills:
- Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field.
- 3 to 6 years of experience in data engineering, software engineering, or a related role.
- Proven experience building and maintaining production data pipelines.
- Expertise in the Hadoop ecosystem: Spark SQL, Iceberg, Hive, etc.
- Extensive experience with Apache Kafka, Apache Flink, and other relevant streaming technologies.
- Orchestration tools: Apache Airflow and UC4.
- Proficiency in Python, Unix shell scripting, or similar languages.
- Good understanding of SQL (Oracle, SQL Server, or similar) and NoSQL technologies.
- Proficiency with version control (Git), CI/CD practices, and collaborative development workflows.
- Strong operations management and stakeholder communication skills.
- Flexibility to work across time zones and a cross-cultural communication mindset.
- Experience working in cross-functional teams.
- Continuous learning mindset and adaptability to new technologies.

We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.
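The data-quality checks this posting calls for often start as simple rule-based validation a pipeline runs before publishing a table. A sketch with invented rows and rules:

```python
rows = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": None},   # null violation
    {"id": 2, "amount": 5.0},    # duplicate key
]

def quality_report(rows, key="id", required=("amount",)):
    """Count rows, duplicate keys, and nulls in required columns."""
    ids = [r[key] for r in rows]
    return {
        "row_count": len(rows),
        "duplicate_keys": len(ids) - len(set(ids)),
        "null_violations": sum(
            1 for r in rows for col in required if r[col] is None
        ),
    }

report = quality_report(rows)
```

A monitoring/alerting layer would then compare `report` against thresholds and page on-call when a check fails; frameworks like Great Expectations formalize the same pattern.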
Posted 1 week ago
4.0 - 9.0 years
5 - 11 Lacs
hyderabad, pune, bengaluru
Work from Office
Skill: Data Engineer / Python + SQL
Experience: 4+ years
Location: Pan India
Notice Period: immediate to 15 days

Job summary and responsibilities:
- Python: minimum 4-5 years of experience.
- SQL: minimum 2-3 years of experience.
- Packages: proficient with Pandas, NumPy, SQLAlchemy, and other database-related packages.
- Analytical skills: strong analytical and problem-solving abilities for troubleshooting and optimizing ETL processes.
- Data structures & OOP: good understanding of data structures and object-oriented programming.
- SQL & databases: proficiency in SQL and experience with databases such as Oracle DB and MongoDB.
- SFTP: experience working with SFTP using Python.
- APIs: familiarity with RESTful APIs, Python requests, and HTTP modules.
- Data handling: strong skills in data cleaning, transformation, and manipulation.
- Data analysis: ability to analyze large datasets, identify patterns or issues, and perform data validation to ensure quality and integrity.
- Version control: proficiency with Git.

Mandatory skills: Python, SQL.
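A small Pandas cleaning-and-transformation sketch of the kind the profile lists; the frame contents are invented:

```python
import pandas as pd

df = pd.DataFrame({
    "name": ["  Ann ", "Bob", "Bob"],
    "amount": ["10", "x", "7"],
})

df["name"] = df["name"].str.strip()                           # trim whitespace
df["amount"] = pd.to_numeric(df["amount"], errors="coerce")   # "x" becomes NaN
clean = df.dropna(subset=["amount"]).drop_duplicates(subset=["name"])
total = clean["amount"].sum()
```

In an ETL job the cleaned frame would then be written out via SQLAlchemy (`clean.to_sql(...)`) or pushed over SFTP, per the stack above.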
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Principal Architect, you will be responsible for designing and implementing robust, scalable, and maintainable microservices architectures. Your expertise in agent frameworks such as Autogen, AWS Agent Framework, and LangGraph will be essential in developing efficient solutions. You will play a key role in integrating safe and reliable LLM mechanisms and implementing best practices in ML Ops, including CI/CD, model versioning, and monitoring. Your responsibilities will also include designing and implementing data pipelines for seamless data ingestion, transformation, and storage. Your strong hands-on experience with frontend, backend, and cloud technologies will be pivotal in ensuring the success of these projects.

Key Skills:
- Data Engineer
- Artificial Intelligence (AI)
- Machine Learning (ML)
- Full Stack Development
- LLM
- Cloud Technologies

If you are passionate about creating innovative solutions and have a strong background in these key skills, we encourage you to apply for this Full Time, Permanent position in the IT/Computers - Software industry. Join us in shaping the future of technology as a leader in Information Technology.

Job Code: GO/JC/20864/2025
Recruiter Name: Maheshwari
Posted 1 week ago