5.0 - 8.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must have skills: Manhattan Warehouse Solutions Technical, Retail Marketing, Manhattan Active Order Management Functional
Good to have skills: NA
Minimum 15 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that solutions are effectively implemented across multiple teams, while maintaining a focus on quality and efficiency in application delivery.

Roles & Responsibilities:
- Expected to be a Subject Matter Expert with deep knowledge and experience.
- Should have influencing and advisory skills.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate workshops and meetings to gather requirements and align on project goals.
- Mentor junior professionals and provide guidance on best practices in application development.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Manhattan Warehouse Solutions Technical, Retail Marketing, Manhattan Active Order Management Functional.
- Experience with application design and architecture.
- Strong understanding of system integration and data flow.
- Ability to troubleshoot and resolve technical issues efficiently.
- Familiarity with agile methodologies and project management tools.

Additional Information:
- The candidate should have a minimum of 15 years of experience in Manhattan Warehouse Solutions Technical.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Posted 3 weeks ago
2.0 - 4.0 years
4 - 8 Lacs
Pune
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must have skills: Informatica PowerCenter
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs, while also troubleshooting any issues that arise in the data flow.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Informatica PowerCenter.
- Good To Have Skills: Experience with data warehousing concepts and practices.
- Strong understanding of ETL processes and data integration techniques.
- Familiarity with database management systems such as Oracle or SQL Server.
- Experience in data quality management and data governance practices.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Informatica PowerCenter.
- This position is based in Pune.
- 15 years of full-time education is required.
Posted 3 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Project Role: Packaged/SaaS Application Engineer
Project Role Description: Configure and support packaged or SaaS applications to adapt features, manage releases, and ensure system stability. Use standard tools, APIs, and low-code platforms to align solutions with business needs while preserving compatibility and performance.
Must have skills: Tax Accounting
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Packaged/SaaS Application Engineer, you will work as an experienced Tax Functional Consultant with in-depth knowledge of the SOVOS tax compliance platform, supporting our SAP landscape and ensuring seamless integration, compliance, and reporting. The consultant will act as the bridge between business stakeholders, tax teams, and technical teams, driving end-to-end solutions for indirect tax compliance (VAT, GST, Sales & Use Tax, etc.) across multiple geographies.

Roles & Responsibilities:
- Act as the subject matter expert (SME) for SOVOS tax compliance in an SAP environment (S/4HANA or ECC).
- Gather business requirements related to tax determination, calculation, and reporting for various countries.
- Configure, maintain, and test the SOVOS integration with SAP (including tax codes, jurisdiction assignments, and mapping).
- Work with finance, tax, and IT teams to ensure legal compliance with indirect tax regulations globally.
- Analyze and troubleshoot tax calculation issues between SAP and SOVOS.
- Manage tax content updates in SOVOS, ensuring correct alignment with business rules and jurisdiction-specific changes.
- Support end-to-end tax return filings and reconciliations from SOVOS output.
- Document system processes, test cases, and user guides for business teams.
- Provide knowledge transfer and training to internal teams on SOVOS functionalities.

Professional & Technical Skills:
- Must Have Skills: Functional expertise in indirect tax processes (VAT, GST, Sales & Use Tax, Withholding Tax).
- 3-5+ years of SAP functional consulting experience in FI, SD, or MM, with strong tax configuration skills.
- Hands-on experience with the SOVOS platform, preferably the SOVOS Global Tax Determination, VAT Reporting, and e-Invoicing modules.
- Strong understanding of ETL data flows between SAP and SOVOS.
- Ability to analyze tax rules, rates, and mapping requirements for global jurisdictions.
- Excellent problem-solving skills to address discrepancies in tax calculations and reporting.
- Experience in SAP-SOVOS integration projects (configuration, mapping, and testing).
- Strong documentation and communication skills.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Tax Accounting.
- This position is based at our Bengaluru office.
- Exposure to SAP S/4HANA migration projects with SOVOS integration.
- Knowledge of e-invoicing regulations in multiple countries.
- Prior experience in large-scale, multi-country SAP deployments.
- Familiarity with other tax engines (Vertex, Avalara) is an added advantage.
- Bachelor's degree in Accounting, Finance, Information Technology, or a related field.
- Professional certifications in SAP or SOVOS modules preferred.
- 15 years of full-time education is required.
Posted 3 weeks ago
3.0 - 8.0 years
15 - 19 Lacs
Bengaluru
Work from Office
Project Role: Technology Architect
Project Role Description: Design and deliver technology architecture for a platform, product, or engagement. Define solutions to meet performance, capability, and scalability needs.
Must have skills: Google Cloud Platform Architecture
Good to have skills: Google BigQuery
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer - AI/ML, you will be responsible for designing, building, and maintaining scalable data pipelines and systems that power AI/ML applications on Google Cloud platforms. Your typical day will involve leveraging Google Cloud's data services, implementing GenAI and AI/ML models, and supporting data-driven solutions through efficient architecture and engineering.

Roles & Responsibilities:
- Design and develop scalable data pipelines and ETL processes using Google Cloud data services like BigQuery, Dataflow, Pub/Sub, and Dataproc.
- Build and optimize data architectures to support AI/ML applications and model training at scale.
- Collaborate with data scientists and ML engineers to implement data ingestion, feature engineering, and model-serving pipelines.
- Develop and manage data integration solutions that align with enterprise data governance and security standards.
- Support GenAI/Vertex AI model deployment by ensuring reliable data access and transformation pipelines.
- Implement monitoring, logging, and alerting for data workflows and ensure data quality across all stages.
- Enable self-service analytics by building reusable data assets and data marts for business stakeholders.
- Ensure cloud-native, production-grade data pipelines and participate in performance tuning and cost optimization.
- Experience with programming languages such as Python, SQL, and optionally Java or Scala.

Professional & Technical Skills:
- Must Have Skills: Strong experience in Google Cloud Data Services (BigQuery, Dataflow, Pub/Sub) and hands-on experience with scalable data engineering pipelines.
- Good To Have Skills: GenAI/Vertex AI exposure, Cloud Data Architecture, PCA/PDE certifications.
- Understanding of data modeling, data warehousing, and distributed computing frameworks.
- Experience with AI/ML data pipelines, MLOps practices, and model deployment workflows.
- Familiarity with CI/CD and infrastructure-as-code tools (Terraform, Cloud Build, etc.) for data projects.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google Cloud Data Engineering or related domains.
- The ideal candidate will possess a strong educational background in computer science, data engineering, or a related field, along with a proven track record of building and scaling data systems for AI/ML initiatives.
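To make the pipeline work in this posting concrete, here is a minimal, hedged sketch of a streaming Dataflow job in Python using the Apache Beam SDK: it reads messages from Pub/Sub and writes rows to BigQuery, the exact pattern the responsibilities list describes. The project, subscription, table, and schema names are hypothetical placeholders, not anything specified by the employer.

```python
# Minimal Apache Beam sketch of a streaming Pub/Sub -> BigQuery pipeline.
# All resource names below are illustrative placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # streaming=True marks this as an unbounded (streaming) pipeline;
    # Dataflow runner/project options would be added here when deploying.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="user_id:STRING,event:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

Run locally with the DirectRunner for testing, or pass `--runner=DataflowRunner` (plus project, region, and temp-location options) to execute on Dataflow.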
Posted 3 weeks ago
5.0 - 10.0 years
15 - 19 Lacs
Gurugram
Work from Office
Project Role: Technology Architect
Project Role Description: Design and deliver technology architecture for a platform, product, or engagement. Define solutions to meet performance, capability, and scalability needs.
Must have skills: Google Cloud Platform Architecture
Good to have skills: Google BigQuery
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer - AI/ML, you will be responsible for designing, building, and maintaining scalable data pipelines and systems that power AI/ML applications on Google Cloud platforms. Your typical day will involve leveraging Google Cloud's data services, implementing GenAI and AI/ML models, and supporting data-driven solutions through efficient architecture and engineering.

Roles & Responsibilities:
- Design and develop scalable data pipelines and ETL processes using Google Cloud data services like BigQuery, Dataflow, Pub/Sub, and Dataproc.
- Build and optimize data architectures to support AI/ML applications and model training at scale.
- Collaborate with data scientists and ML engineers to implement data ingestion, feature engineering, and model-serving pipelines.
- Develop and manage data integration solutions that align with enterprise data governance and security standards.
- Support GenAI/Vertex AI model deployment by ensuring reliable data access and transformation pipelines.
- Implement monitoring, logging, and alerting for data workflows and ensure data quality across all stages.
- Enable self-service analytics by building reusable data assets and data marts for business stakeholders.
- Ensure cloud-native, production-grade data pipelines and participate in performance tuning and cost optimization.
- Experience with programming languages such as Python, SQL, and optionally Java or Scala.

Professional & Technical Skills:
- Must Have Skills: Strong experience in Google Cloud Data Services (BigQuery, Dataflow, Pub/Sub) and hands-on experience with scalable data engineering pipelines.
- Good To Have Skills: GenAI/Vertex AI exposure, Cloud Data Architecture, PCA/PDE certifications.
- Understanding of data modeling, data warehousing, and distributed computing frameworks.
- Experience with AI/ML data pipelines, MLOps practices, and model deployment workflows.
- Familiarity with CI/CD and infrastructure-as-code tools (Terraform, Cloud Build, etc.) for data projects.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google Cloud Data Engineering or related domains.
- The ideal candidate will possess a strong educational background in computer science, data engineering, or a related field, along with a proven track record of building and scaling data systems for AI/ML initiatives.
Posted 3 weeks ago
2.0 - 4.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must have skills: GCP Dataflow
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must Have Skills: Proficiency in GCP Dataflow.
- Strong understanding of data pipeline architecture and design principles.
- Experience with ETL processes and data integration techniques.
- Familiarity with data quality frameworks and best practices.
- Knowledge of cloud computing concepts and services, particularly within the Google Cloud Platform.

Additional Information:
- The candidate should have a minimum of 5 years of experience in GCP Dataflow.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Posted 3 weeks ago
8.0 - 11.0 years
15 - 19 Lacs
Pune
Work from Office
Project Role: Technology Architect
Project Role Description: Design and deliver technology architecture for a platform, product, or engagement. Define solutions to meet performance, capability, and scalability needs.
Must have skills: Google Cloud Platform Architecture
Good to have skills: Google BigQuery
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer - AI/ML, you will be responsible for designing, building, and maintaining scalable data pipelines and systems that power AI/ML applications on cloud platforms. Your typical day will involve leveraging Google Cloud's data services, implementing GenAI and AI/ML models, and supporting data-driven solutions through efficient architecture and engineering.

Roles & Responsibilities:
- Design and develop scalable data pipelines and ETL processes using Google Cloud data services like BigQuery, Dataflow, Pub/Sub, and Dataproc.
- Build and optimize data architectures to support AI/ML applications and model training at scale.
- Collaborate with data scientists and ML engineers to implement data ingestion, feature engineering, and model-serving pipelines.
- Develop and manage data integration solutions that align with enterprise data governance and security standards.
- Support GenAI/Vertex AI model deployment by ensuring reliable data access and transformation pipelines.
- Implement monitoring, logging, and alerting for data workflows and ensure data quality across all stages.
- Enable self-service analytics by building reusable data assets and data marts for business stakeholders.
- Ensure cloud-native, production-grade data pipelines and participate in performance tuning and cost optimization.
- Experience with programming languages such as Python, SQL, and optionally Java or Scala.

Professional & Technical Skills:
- Must Have Skills: Strong experience in Google Cloud Data Services (BigQuery, Dataflow, Pub/Sub) and hands-on experience with scalable data engineering pipelines.
- Good To Have Skills: GenAI/Vertex AI exposure, Cloud Data Architecture, PCA/PDE certifications.
- Understanding of data modeling, data warehousing, and distributed computing frameworks.
- Experience with AI/ML data pipelines, MLOps practices, and model deployment workflows.
- Familiarity with CI/CD and infrastructure-as-code tools (Terraform, Cloud Build, etc.) for data projects.

Additional Information:
- The candidate should have a minimum of 7 years of experience in Google Cloud Data Engineering or related domains.
- The ideal candidate will possess a strong educational background in computer science, data engineering, or a related field, along with a proven track record of building and scaling data systems for AI/ML initiatives.
Posted 3 weeks ago
2.0 - 4.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must have skills: Google Cloud Platform Architecture
Good to have skills: Google Looker Data Platform
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Google Cloud Platform Architecture.
- Good To Have Skills: Experience with Google Looker Data Platform.
- Strong understanding of data modeling and database design principles.
- Experience with data warehousing solutions and ETL tools.
- Familiarity with programming languages such as Python or Java for data manipulation.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google Cloud Platform Architecture.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Posted 3 weeks ago
8.0 - 11.0 years
15 - 19 Lacs
Chennai
Work from Office
Project Role: Technology Architect
Project Role Description: Design and deliver technology architecture for a platform, product, or engagement. Define solutions to meet performance, capability, and scalability needs.
Must have skills: Google Cloud Platform Architecture
Good to have skills: Google BigQuery
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer - AI/ML, you will be responsible for designing, building, and maintaining scalable data pipelines and systems that power AI/ML applications on cloud platforms. Your typical day will involve leveraging Google Cloud's data services, implementing GenAI and AI/ML models, and supporting data-driven solutions through efficient architecture and engineering.

Roles & Responsibilities:
- Design and develop scalable data pipelines and ETL processes using Google Cloud data services like BigQuery, Dataflow, Pub/Sub, and Dataproc.
- Build and optimize data architectures to support AI/ML applications and model training at scale.
- Collaborate with data scientists and ML engineers to implement data ingestion, feature engineering, and model-serving pipelines.
- Develop and manage data integration solutions that align with enterprise data governance and security standards.
- Support GenAI/Vertex AI model deployment by ensuring reliable data access and transformation pipelines.
- Implement monitoring, logging, and alerting for data workflows and ensure data quality across all stages.
- Enable self-service analytics by building reusable data assets and data marts for business stakeholders.
- Ensure cloud-native, production-grade data pipelines and participate in performance tuning and cost optimization.
- Experience with programming languages such as Python, SQL, and optionally Java or Scala.

Professional & Technical Skills:
- Must Have Skills: Strong experience in Google Cloud Data Services (BigQuery, Dataflow, Pub/Sub) and hands-on experience with scalable data engineering pipelines.
- Good To Have Skills: GenAI/Vertex AI exposure, Cloud Data Architecture, PCA/PDE certifications.
- Understanding of data modeling, data warehousing, and distributed computing frameworks.
- Experience with AI/ML data pipelines, MLOps practices, and model deployment workflows.
- Familiarity with CI/CD and infrastructure-as-code tools (Terraform, Cloud Build, etc.) for data projects.

Additional Information:
- The candidate should have a minimum of 9 years of experience in Google Cloud Data Engineering or related domains.
- The ideal candidate will possess a strong educational background in computer science, data engineering, or a related field, along with a proven track record of building and scaling data systems for AI/ML initiatives.
Posted 3 weeks ago
6.0 - 8.0 years
22 - 25 Lacs
Pune
Work from Office
Work Requirements
Minimum Bachelor's Degree required, with 6-8 years of experience in Oracle EBS R12:
1. 6+ years of experience with Oracle E-Business Suite applications (11i or R12) as a Techno-Functional Consultant.
2. 6+ years of experience with SQL, PL/SQL, and SQL tuning, including SQL and PL/SQL development tools.
3. 5+ years of experience with Oracle Forms and Oracle Reports.
4. 5+ years of experience with XML/BI Publisher.
5. 2+ years of experience with Oracle Workflow Builder.
6. 2+ years of experience with Unix shell scripting.

Specific Work Preferences
1. Knowledge of general business operating principles.
2. Advanced troubleshooting skills.
3. Ability to multitask and maintain composure when working with business users.
4. Good techno-functional knowledge of the Order to Cash (O2C) and Procure to Pay (P2P) processes.
5. Technical expertise with a solid understanding of the underlying data flow and functionality in Oracle modules like Inventory, Shipping Execution, Order Management, Purchasing, iProcurement, WIP, and BOM.
6. Good technical and functional knowledge of Supply Chain modules.
7. Expertise in Forms personalization and customization.
8. Knowledge of Oracle Application Framework (OAF) and ADF a plus.
9. Knowledge of Oracle Mobile Forms development is a plus.
10. Knowledge of Application Object Library.
11. Excellent analytical and problem-solving skills.
12. Excellent verbal and written communication skills.

Mandatory Key Skills: troubleshooting, SQL, PL/SQL, SQL tuning, Oracle Forms, XML, Unix, shell scripting, Oracle Application Framework, Oracle EBS R12, Oracle EBS
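As a small illustration of the SQL-tuning skill this posting asks for, the sketch below queries an Oracle EBS Order Management table from Python using bind variables, which let the database reuse a cached execution plan instead of hard-parsing a new literal-laden statement on every call. It assumes the python-oracledb driver; the connection details are placeholders, and the table and column names are drawn from the standard OM schema but should be verified against the target instance.

```python
# Hedged sketch: bind-variable query against an Oracle EBS table.
# Credentials, DSN, and org_id are hypothetical placeholders.
import datetime

import oracledb

conn = oracledb.connect(user="apps", password="secret",
                        dsn="ebs-host:1521/EBSDB")
with conn.cursor() as cur:
    # :org_id and :since are bind variables - Oracle can reuse the
    # parsed plan across calls with different values.
    cur.execute(
        """
        SELECT header_id, ordered_date, flow_status_code
          FROM oe_order_headers_all
         WHERE org_id = :org_id
           AND ordered_date >= :since
        """,
        org_id=204, since=datetime.date(2024, 1, 1),
    )
    for row in cur.fetchmany(10):
        print(row)
conn.close()
```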
Posted 3 weeks ago
7.0 - 9.0 years
13 - 17 Lacs
Chennai
Work from Office
Key Responsibilities:
- Design and implement scalable and efficient full-stack solutions using Java and cloud technologies.
- Develop and maintain cloud-based solutions on Google Cloud Platform (GCP), utilizing services like BigQuery, Astronomer, Terraform, Airflow, and Dataflow.
- Architect and implement complex data engineering solutions using GCP services.
- Collaborate with cross-functional teams to develop, deploy, and optimize cloud-based applications.
- Utilize Python for data engineering and automation tasks within the cloud environment.
- Ensure alignment with GCP architecture best practices and contribute to the design of high-performance systems.
- Lead and mentor junior developers, fostering a culture of learning and continuous improvement.

Required Skills:
- Full-Stack Development (7+ years): Strong expertise in full-stack Java development, with experience in building and maintaining complex web applications.
- Google Cloud Platform (GCP): Hands-on experience with GCP services like BigQuery, Astronomer, Terraform, Airflow, Dataflow, and GCP architecture.
- Python: Proficiency in Python for automation and data engineering tasks.
- Cloud Architecture: Solid understanding of GCP architecture principles and best practices.
- Strong problem-solving skills and the ability to work in a dynamic, fast-paced environment.

Mandatory Key Skills: Python, Cloud Architecture, Java, data engineering, GCP, BigQuery, Astronomer, Terraform, Airflow
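Since the role names Airflow (via Astronomer) for orchestration, here is a minimal Airflow 2.x DAG sketch of the extract-then-load pattern such pipelines typically follow. The DAG id, schedule, and task bodies are hypothetical placeholders; `schedule=` assumes Airflow 2.4+ (older releases use `schedule_interval=`).

```python
# Minimal two-task Airflow DAG: extract, then load, once per day.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull source data")  # placeholder for real extraction logic


def load():
    print("load to BigQuery")  # placeholder for real load logic


with DAG(
    dag_id="daily_ingest",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,                  # skip backfilling past intervals
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task       # load runs only after extract succeeds
```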
Posted 3 weeks ago
7.0 - 12.0 years
19 - 22 Lacs
Chennai
Work from Office
Key Responsibilities:
- Design and implement scalable and efficient full-stack solutions using Java and cloud technologies.
- Develop and maintain cloud-based solutions on Google Cloud Platform (GCP), utilizing services like BigQuery, Astronomer, Terraform, Airflow, and Dataflow.
- Architect and implement complex data engineering solutions using GCP services.
- Collaborate with cross-functional teams to develop, deploy, and optimize cloud-based applications.
- Utilize Python for data engineering and automation tasks within the cloud environment.
- Ensure alignment with GCP architecture best practices and contribute to the design of high-performance systems.
- Lead and mentor junior developers, fostering a culture of learning and continuous improvement.

Required Skills:
- Full-Stack Development (7+ years): Strong expertise in full-stack Java development, with experience in building and maintaining complex web applications.
- Google Cloud Platform (GCP): Hands-on experience with GCP services like BigQuery, Astronomer, Terraform, Airflow, Dataflow, and GCP architecture.
- Python: Proficiency in Python for automation and data engineering tasks.
- Cloud Architecture: Solid understanding of GCP architecture principles and best practices.
- Strong problem-solving skills and the ability to work in a dynamic, fast-paced environment.

Mandatory Key Skills: BigQuery, Astronomer, Terraform, Airflow, Cloud Architecture, Java, Google Cloud Platform, Python, Software Development
Posted 3 weeks ago
2.0 - 5.0 years
18 - 25 Lacs
Bengaluru
Work from Office
Job Title: Data Engineer | Python & SQL | Intelligence AI Platform
Experience: 2-5 Years

Job Description: We are looking for a skilled Data Engineer (2-5 years) to join our team working on Intelligine, an AI-powered content creation and marketing platform. The ideal candidate will have strong expertise in Python, SQL, and ETL/ELT pipelines, with a proven ability to work on scalable data solutions.

Key Responsibilities:
* Design, build, and maintain robust ETL/ELT pipelines using Python and SQL
* Develop optimized SQL queries (joins, CTEs, window functions)
* Integrate data from multiple sources: APIs, flat files, databases, and third-party tools
* Optimize workflows, query performance, and database design
* Collaborate with cross-functional teams to deliver secure and scalable data solutions
* Automate processes and monitoring to ensure pipeline reliability

Qualifications:
* Bachelor's/Master's in Computer Science, Data Engineering, or a related field
* 2-5 years of professional experience as a Data Engineer (Python & SQL focus)
* Strong understanding of data structures, algorithms, and DBMS

Attributes:
* Strong analytical and problem-solving mindset
* Effective communication and collaboration skills
* Attention to detail and code quality
* Self-driven, able to work independently or in teams
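To illustrate the "joins, CTEs, window functions" requirement above, the following self-contained example runs a CTE plus a window function through Python's standard-library sqlite3 module (window functions need SQLite 3.25+, bundled with modern Python builds). The table and data are invented purely for demonstration.

```python
# Runnable demo: CTE + window function via the stdlib sqlite3 module.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('alice', 120.0), ('alice', 80.0), ('bob', 200.0), ('bob', 50.0);
""")

query = """
WITH totals AS (                      -- CTE: per-customer aggregates
    SELECT customer, SUM(amount) AS total
      FROM orders
     GROUP BY customer
)
SELECT customer,
       total,
       RANK() OVER (ORDER BY total DESC) AS spend_rank  -- window function
  FROM totals;
"""
for row in conn.execute(query):
    print(row)   # ('bob', 250.0, 1) then ('alice', 200.0, 2)
```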
Posted 3 weeks ago
4.0 - 8.0 years
5 - 8 Lacs
Chennai
Work from Office
Responsibilities
What you'll do:
- Engineer, test, document, and manage GCP Dataproc, Dataflow, and Vertex AI services used in high-performance data processing pipelines and machine learning.
- Help developers optimize data processing jobs using Spark, Python, and Java.
- Collaborate with development teams to integrate data processing pipelines with other cloud services and applications.
- Utilize Terraform and Tekton for infrastructure as code (IaC) and CI/CD pipelines, ensuring efficient deployment and management.

Good to have:
- Experience with Spark for large-scale data processing.
- Solid understanding of and experience with GitHub for version control and collaboration.
- Experience with Terraform for infrastructure management and Tekton for continuous integration and deployment.
- Experience with Apache NiFi for data flow automation.
- Knowledge of Apache Kafka for real-time data streaming.
- Familiarity with Google Cloud Pub/Sub for event-driven systems and messaging.
- Familiarity with Google BigQuery.

Mandatory Key Skills: Python, Java, Google Cloud Pub/Sub, Apache Kafka, BigQuery, CI/CD, Machine Learning, Spark
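For a concrete feel of the "optimize data processing jobs using Spark" line, here is a hedged PySpark batch-job sketch of the kind commonly submitted to Dataproc: read raw JSON events from Cloud Storage, aggregate daily page views, and write the result back as Parquet. The bucket paths and column names are hypothetical placeholders.

```python
# Sketch of a Spark batch aggregation, as might run on a Dataproc cluster.
# gs:// paths and the ts/page columns are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("clickstream-agg").getOrCreate()

events = spark.read.json("gs://my-bucket/raw/events/")  # placeholder path

daily = (
    events
    .withColumn("day", F.to_date("ts"))       # derive a date from the event timestamp
    .groupBy("day", "page")
    .agg(F.count("*").alias("views"))         # views per page per day
)

# Parquet keeps the curated layer columnar and cheap to scan.
daily.write.mode("overwrite").parquet("gs://my-bucket/curated/daily_views/")
spark.stop()
```

Submitted via `gcloud dataproc jobs submit pyspark`, a job like this is where the optimization work the posting mentions happens: partition sizing, shuffle tuning, and caching decisions.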
Posted 3 weeks ago
3.0 - 5.0 years
10 - 13 Lacs
Chennai
Work from Office
- 3+ years' experience as an engineer who has worked in a GCP environment with its relevant tools and services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.)
- 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.)
- 1-2+ years of strong experience in the SQL language (CTEs, window functions, aggregate functions, etc.)

Mandatory Key Skills: Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer, Airflow, Kafka, Cloud Spanner streaming, Python development, Pandas, PySpark, SQL, GCP, BigQuery, Dataproc, Dataflow, Cloud Storage
Posted 3 weeks ago
5.0 - 8.0 years
7 - 11 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
We are seeking a skilled Data Modeller to join our team at KPI Partners. The ideal candidate will be responsible for designing and implementing data models that effectively meet the business needs of our clients while ensuring data integrity and optimization.

Key Responsibilities:
- Collaborate with stakeholders to gather requirements and understand data architecture needs.
- Design logical and physical data models that align with business objectives and requirements.
- Optimize data models for performance and scalability.
- Develop and maintain documentation for data models, including data flow diagrams and metadata repositories.
- Ensure data quality by implementing data validation and verification procedures.
- Work closely with database administrators and developers to ensure seamless data integration and implementation.
- Participate in data governance and security initiatives.
- Stay updated on industry trends and best practices in data modelling and management.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Modeller or in a similar role, with a track record of successful data modelling projects.
- Strong understanding of data warehousing concepts and methodologies.
- Proficiency in data modelling tools and technologies such as Erwin, Oracle SQL, or other relevant tools.
- Excellent analytical and problem-solving skills.
- Strong communication and interpersonal skills, with the ability to work collaboratively in a team environment.

If you are passionate about data and have a desire to drive successful outcomes through effective data modelling, we invite you to apply to join our dynamic team at KPI Partners in one of our locations in Hyderabad, Bangalore, or Pune.
Posted 3 weeks ago
3.0 - 5.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow
Programming Languages: Java; scripting languages like Python, Shell Script, SQL
- 5+ years of experience in IT application delivery, with proven experience in agile development methodologies
- 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, data processing like Dataflow)
Posted 3 weeks ago
5.0 - 7.0 years
15 - 17 Lacs
Chennai
Work from Office
Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow
Programming Languages: Java; scripting languages like Python, Shell Script, SQL
- 5+ years of experience in IT application delivery, with proven experience in agile development methodologies
- 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, data processing like Dataflow)

Mandatory Key Skills: Java, Dataflow, SQL, shell scripting, agile, cloud platform, Google, gen, data processing, Hive, Sqoop, Spark, Hadoop, AWS, big data, JavaScript, Ansible, Docker, Jenkins, Linux, Microsoft Azure, HTML, Git, Google Cloud Platform
Posted 3 weeks ago
11.0 - 16.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
It's fun to work in a company where people truly BELIEVE in what they are doing! We're committed to bringing passion and customer focus to the business.

Fractal is one of the most prominent players in the Artificial Intelligence space. Fractal's mission is to power every human decision in the enterprise, and it brings AI, engineering, and design to help the world's most admired Fortune 500 companies. Fractal has consistently been rated one of India's best companies to work for by The Great Place to Work Institute.

Cloud, Data, and AI technologies are seeing tremendous innovation and are driving digital transformation across enterprises at an unprecedented pace (more than 100%). At Fractal, we are helping enterprises harness the power of data on the cloud through architecture consulting, data platforms, business-tech platforms, marketplaces, data governance, MDM, DataOps, and AI/MLOps. In addition, we leverage AI engineering practices to augment each of these areas.

Responsibilities
- Evaluate the current technology landscape and recommend a forward-looking, short- and long-term technology strategic vision.
- Participate in the creation and sharing of best practices, technical content, and new reference architectures.
- Work with data engineers and data scientists to develop architectures and solutions.
- Assist in ensuring the smooth delivery of services, products, and solutions.

Skills & Requirements
In-depth experience as an Architect with Google Cloud Platform expertise and a passion for applying the latest technologies to solve business problems. An ideal candidate would have:
- 11-16 years of experience in Data Engineering & Cloud Native technologies (including Google Cloud Platform) covering the big data, analytics, and AI/ML domains, with experience using GCP.
- Experience in BigQuery, Cloud Composer, Dataflow, Cloud Storage, AI Platform/Vertex AI, Dataproc, and GCP IaaS.
- Creating, deploying, configuring, and scaling applications on GCP serverless infrastructure.
- Knowledge and working experience in data engineering, data management, and data governance.
- Experience working on multiple end-to-end data engineering and/or analytics projects.
- Knowledge of general programming languages and frameworks, in particular Python and/or Java.
- General technology best practices and development lifecycles such as agile and CI/CD, as well as those enabling more efficient use of data and machine learning, such as DevOps and MLOps.
- Technical architecture leadership and direction on projects, resulting in secure, scalable, reliable, and maintainable platforms.
- Architecture skills that enable the creation of future-proof, complex global solutions using GCP services.
- Implementation and/or creation of foundational architectures, including microservices, event-driven and event-streaming architectures, and those that enable online machine learning systems.
- Excellent communication and influencing skills, with the ability to adapt to the target audience.

Good To Have
- Experience with container technology, specifically Docker and Kubernetes, and DevOps on GCP.
- Google Cloud - Professional Cloud Architect certification.

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!
Posted 3 weeks ago
5.0 - 10.0 years
12 - 20 Lacs
Bengaluru
Remote
Role & responsibilities
- Build and deploy data platforms, data warehouses, and big data solutions across industries (BFSI, Manufacturing, Healthcare, eCommerce, IoT, Digital Twin, etc.).
- Integrate, transform, and consolidate data from various structured and unstructured data systems.
- Expertise in data ingestion, transformation, storage, and analysis, often using Azure services, and in migration from legacy on-premise services.
- Essential skills include SQL, Python, and R, plus knowledge of ETL/ELT processes and big data technologies like Apache Spark, Scala, and PySpark.
- Maintain data integrity, resolve data-related issues, and ensure the reliability and performance of data solutions.
- Work with stakeholders to provide real-time data analytics, monitor data pipelines, and optimize performance and scalability.
- Strong understanding of data management fundamentals, data warehousing, and data modeling.
- Big data technologies: HDFS, Spark, HBase, Hive, Sqoop, Kafka, RabbitMQ, Flink.
- Implement seamless data integration solutions between Azure/AWS/GCP and Snowflake platforms.
- Identify and resolve performance bottlenecks, optimize queries, and ensure the overall efficiency of data pipelines.
- Lead the development and management of data infrastructure, including tools, dashboards, queries, reports, and scripts, ensuring automation of recurring tasks while maintaining data quality and integrity.
- Implement and maintain data security measures, ensuring compliance with industry standards and regulations.
- Ensure data architecture aligns with business requirements and best practices.
- Experience in Power BI/Tableau/Looker.
- Management, administration, and maintenance of data streaming tools such as Kafka/Confluent Kafka and Flink.
- Experience in test-driven development and building libraries, plus proficiency in Pandas, NumPy, Elasticsearch, and Apache Beam.
- Familiarity with CI/CD pipelines, monitoring, and infrastructure-as-code (e.g., Terraform, CloudFormation).
- Proficient in query optimization, data partitioning, indexing strategies, and caching mechanisms.
- Ensure GDPR, SOX, and other regulatory compliance across data workflows.
- Exposure to working with event/file/table formats such as Avro, Parquet, Iceberg, and Delta.

Must-have skills (at least one or two):
- Azure: Data Factory (ADF), Databricks, Synapse, Data Lake Storage, Time Series Insights, Azure SQL Database, SQL Server, Presto, SSIS
- AWS: data services such as S3, Glue Studio, Redshift, Athena, and EMR, plus Airflow, IAM, DBT, Lambda, RDS, DynamoDB, Neo4j, Amazon Neptune
- GCP: BigQuery, SQL, Composer, Dataflow, Dataform, DBT, Python, Cloud Functions, Dataproc + PySpark, Python ingestion, Cloud Storage, Pub/Sub, Vertex AI, GKE
- OCI: Object Storage, OCI Data Integration, Oracle Database, Oracle Analytics Cloud (OAC), Autonomous Data Warehouse (ADW), NetSuite Analytics Warehouse (NSAW), PL/SQL, Exadata
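As a small, hedged illustration of the Kafka/Confluent Kafka streaming skills listed above, the sketch below produces a JSON event with the confluent-kafka Python client. The broker address, topic name, and payload are placeholders, not anything specified by the employer.

```python
# Hedged sketch: produce one JSON event to a Kafka topic.
# Broker, topic, and payload are illustrative placeholders.
import json

from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})


def on_delivery(err, msg):
    # Invoked once per message; surfaces broker-side delivery failures.
    if err is not None:
        print(f"delivery failed: {err}")
    else:
        print(f"delivered to {msg.topic()}[{msg.partition()}]")


event = {"sensor_id": 42, "temp_c": 21.5}
producer.produce("telemetry", json.dumps(event).encode("utf-8"),
                 callback=on_delivery)
producer.flush()  # block until all outstanding messages are delivered
```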
Posted 3 weeks ago
5.0 - 10.0 years
20 - 30 Lacs
Noida, Pune, Bengaluru
Hybrid
Experience: 5+ years
Location: Pan India

GCP BigQuery Developer
Mandatory Skills: GCP, BigQuery, Python, SQL

Responsibilities:
- Good knowledge of GCP, BigQuery, SQL, Python, and Dataflow.
- Has worked on implementation projects building data pipelines, transformation logic, and data models.
- Proficient in dealing with the data access layer: RDBMS and NoSQL.
- Experience in implementing and deploying big data applications with GCP big data services.
- Implement data solutions using GCP; must be familiar with programming in SQL and Python.

Interested candidates, share your CV at himani.girnar@alikethoughts.com with the details below:
- Candidate's name
- Email and alternate email ID
- Contact and alternate contact no.
- Total exp
- Relevant experience
- Current org
- Notice period
- CCTC
- ECTC
- Current location
- Preferred location
- Pancard no.
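A minimal sketch of the BigQuery-plus-Python work described, using the official google-cloud-bigquery client library; the project, dataset, and table names are hypothetical, and credentials are assumed to come from the environment (e.g., Application Default Credentials).

```python
# Hedged sketch: run a SQL aggregation on BigQuery from Python.
# `my-project.sales.orders` is a placeholder table.
from google.cloud import bigquery

client = bigquery.Client()  # picks up project/credentials from the environment

sql = """
    SELECT country, COUNT(*) AS n
      FROM `my-project.sales.orders`
     GROUP BY country
     ORDER BY n DESC
     LIMIT 10
"""

# query() submits the job; result() blocks until it completes and
# returns an iterable of Row objects.
for row in client.query(sql).result():
    print(row["country"], row["n"])
```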
Posted 3 weeks ago
5.0 - 10.0 years
15 - 20 Lacs
Chennai
Work from Office
Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow
Programming Languages: Java; scripting languages like Python, Shell Script, SQL
- 5+ years of experience in IT application delivery, with proven experience in agile development methodologies
- 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, data processing like Dataflow)

Mandatory Key Skills: Google Cloud Platform, GCS, BigQuery, Dataflow, Java
Posted 3 weeks ago
6.0 - 8.0 years
18 - 20 Lacs
Chennai
Work from Office
Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow
Programming Languages: Java; scripting languages like Python, Shell Script, SQL
- 5+ years of experience in IT application delivery, with proven experience in agile development methodologies
- 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, data processing like Dataflow)

Mandatory Key Skills: Product Management, Product Development, Product Operations, GCS, Dataproc, BigQuery, Composer, Data Processing, Product Design
Posted 3 weeks ago
3.0 - 8.0 years
5 - 15 Lacs
Pune, Bengaluru, Delhi / NCR
Work from Office
Educational Requirements: MCA, MSc, MTech, Bachelor of Engineering, BCA, BSc
Service Line: Data & Analytics Unit

Responsibilities
A day in the life of an Infoscion:
- As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight.
- You will develop a proposal by owning parts of the proposal document and by giving inputs to solution design based on areas of expertise.
- You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design.
- You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines.
- Actively lead small projects and contribute to unit-level and organizational initiatives, with the objective of providing high-quality, value-adding solutions to customers.

If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability
- Good knowledge of software configuration management systems
- Awareness of the latest technologies and industry trends
- Logical thinking and problem-solving skills, along with an ability to collaborate
- Understanding of the financial processes for various types of projects and the various pricing models available
- Ability to assess current processes, identify improvement areas, and suggest technology solutions
- One or two industry domain knowledge areas
- Client interfacing skills
- Project and team management

Technical and Professional Requirements:
- Technology->Cloud Platform->GCP Data Analytics->Looker
- Technology->Cloud Platform->GCP Database->Google BigQuery

Preferred Skills:
- Technology->Cloud Platform->Google Big Data
- Technology->Cloud Platform->GCP Data Analytics
Posted 3 weeks ago
7.0 - 12.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Your future role
Take on a new challenge and apply your data engineering expertise in a cutting-edge field. You'll work alongside collaborative and innovative teammates. You'll play a key role in enabling data-driven decision-making across the organization by ensuring data availability, quality, and accessibility. Day-to-day, you'll work closely with teams across the business (e.g., Data Scientists, Analysts, and ML Engineers), mentor junior engineers, and contribute to the architecture and design of our data platforms and solutions. You'll specifically take care of designing and developing scalable data pipelines, as well as managing and optimizing object storage systems.

We'll look to you for:
- Designing, developing, and maintaining scalable and efficient data pipelines using tools like Apache NiFi and Apache Airflow.
- Creating robust Python scripts for data ingestion, transformation, and validation.
- Managing and optimizing object storage systems such as Amazon S3, Azure Blob, or Google Cloud Storage.
- Collaborating with Data Scientists and Analysts to understand data requirements and deliver production-ready datasets.
- Implementing data quality checks, monitoring, and alerting mechanisms.
- Ensuring data security, governance, and compliance with industry standards.
- Mentoring junior engineers and promoting best practices in data engineering.

All about you
We value passion and attitude over experience. That's why we don't expect you to have every single skill. Instead, we've listed some that we think will help you succeed and grow in this role:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 7+ years of experience in data engineering or a similar role.
- Strong proficiency in Python and data processing libraries (e.g., Pandas, PySpark).
- Hands-on experience with Apache NiFi for data flow automation.
- Deep understanding of object storage systems and cloud data architectures.
- Proficiency in SQL and experience with both relational and NoSQL databases.
- Familiarity with cloud platforms (AWS, Azure, or GCP).
- Exposure to the Data Science ecosystem, including tools like Jupyter, scikit-learn, TensorFlow, or MLflow.
- Experience working in cross-functional teams with Data Scientists and ML Engineers.
- Cloud certifications or relevant technical certifications are a plus.
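To ground the "robust Python scripts for data ingestion, transformation, and validation" bullet, here is a hedged sketch that pulls a CSV from S3 with boto3, applies basic quality checks with pandas, and writes a curated Parquet copy back. The bucket and key names are hypothetical, and writing Parquet assumes pyarrow (or fastparquet) is installed.

```python
# Illustrative ingest-validate-promote script; all S3 names are placeholders.
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")

# Ingest: fetch the raw CSV from object storage.
obj = s3.get_object(Bucket="raw-zone", Key="sales/2024/orders.csv")
df = pd.read_csv(io.BytesIO(obj["Body"].read()))

# Validate: basic data-quality checks before promoting the file.
assert {"order_id", "amount"}.issubset(df.columns), "schema drift detected"
df = df.dropna(subset=["order_id"]).drop_duplicates("order_id")

# Transform and promote: write a columnar copy to the curated zone.
buf = io.BytesIO()
df.to_parquet(buf, index=False)
s3.put_object(Bucket="curated-zone", Key="sales/2024/orders.parquet",
              Body=buf.getvalue())
```

In a production pipeline the asserts would become logged quality metrics and the script would run as a NiFi processor or an Airflow task rather than standalone.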
Posted 3 weeks ago