5.0 - 10.0 years
6 - 16 Lacs
Kolkata, Bengaluru, Mumbai (All Areas)
Work from Office
Responsibilities
A day in the life of an Infoscion
• As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions and facilitate deployment, resulting in client delight.
• You will develop proposals by owning parts of the proposal document and by giving inputs on solution design based on your areas of expertise.
• You will plan configuration activities, configure the product as per the design, conduct conference room pilots and assist in resolving any queries related to requirements and solution design.
• You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines.
• You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers.
If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! 
Additional Responsibilities:
• Ability to develop value-creating strategies and models that enable clients to innovate, drive growth and increase their business profitability
• Good knowledge of software configuration management systems
• Awareness of the latest technologies and industry trends
• Logical thinking and problem-solving skills, along with an ability to collaborate
• Understanding of the financial processes for various types of projects and the various pricing models available
• Ability to assess current processes, identify improvement areas and suggest technology solutions
• Domain knowledge in one or two industries
• Client interfacing skills
• Project and team management
Technical and Professional Requirements: Technology->Cloud Platform->GCP Data Analytics->Looker, Technology->Cloud Platform->GCP Database->Google BigQuery
Preferred Skills: Technology->Cloud Platform->Google Big Data, Technology->Cloud Platform->GCP Data Analytics
Posted 2 weeks ago
0.0 years
0 Lacs
Pune, Maharashtra, India
On-site
• Work with the team in the capacity of GCP Data Engineer on day-to-day activities
• Solve problems at hand with utmost clarity and speed
• Train and coach other team members
• Ability to turn around quickly
• Work with data analysts and architects to help them solve any specific issues with tooling/processes
• Design, build and operationalize large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with 3rd-party tools - Python/Java/React.js, Airflow ETL skills, GCP services (BigQuery, Dataflow, Cloud SQL, Cloud Functions, Data Lake)
• Design and build production data pipelines from ingestion to consumption within a big data architecture
• GCP BigQuery modeling and performance tuning techniques
• RDBMS and NoSQL database experience
• Knowledge of orchestrating workloads on cloud
• Implement data warehouse and big/small data designs and data lake solutions with very good data quality capabilities
• Understanding and knowledge of deployment strategies and CI/CD
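As a hedged illustration of the "ingestion to consumption" pipeline work this posting describes: the sketch below shows one minimal transform step of the kind a pipeline task (for example, a Python task scheduled by an orchestrator like Airflow) might run. It uses only the standard library; the function and field names are hypothetical, not from the posting.

```python
# Illustrative only: a minimal ingest -> transform step for a data pipeline.
# Names (transform_orders, order_id, amount) are hypothetical examples.
import csv
import io

def transform_orders(raw_csv: str) -> list:
    """Parse raw CSV, drop incomplete rows, and cast amounts to float."""
    rows = []
    for rec in csv.DictReader(io.StringIO(raw_csv)):
        if not rec.get("order_id") or not rec.get("amount"):
            continue  # basic data-quality gate: skip incomplete records
        rows.append({"order_id": rec["order_id"], "amount": float(rec["amount"])})
    return rows

raw = "order_id,amount\nA1,10.5\n,3.0\nA2,7.25\n"
print(transform_orders(raw))  # the row with no order_id is dropped
```

In a real GCP pipeline the parsed rows would typically be written to BigQuery or Cloud Storage rather than printed.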
Posted 2 weeks ago
5.0 - 7.0 years
14 - 17 Lacs
Pune
Work from Office
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.
In your role, you will be responsible for:
• Multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer, etc.
• Python and SQL work experience; proactive, collaborative, and able to respond to critical situations
• Ability to analyse data for functional business requirements and front-face the customer
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
• 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform
• Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer
• Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work
• You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies
• End-to-end functional knowledge of the data pipeline/transformation implementations the candidate has done; should understand the purpose/KPIs for which the data transformation was done 
Preferred technical and professional experience:
• Experience with AEM core technologies: OSGi services, Apache Sling, Granite framework, Java Content Repository API, Java 8+, localization
• Familiarity with build tools such as Jenkins and Maven
• Knowledge of version control tools, especially Git
• Knowledge of patterns and good practices to design and develop quality, clean code
• Knowledge of HTML, CSS, JavaScript and jQuery
• Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
We are looking for experienced professionals with strong expertise in Google Cloud Platform (GCP) database services. The role involves designing, implementing, and troubleshooting scalable database solutions on GCP.
Responsibilities:
- Proven experience as a Subject Matter Expert in Google Cloud native databases and managed SQL solutions, or a similar role.
- In-depth knowledge of Google Cloud Platform (GCP) and its database tools, including Cloud SQL, BigQuery, and Spanner.
- Strong analytical and problem-solving skills.
- Excellent communication and presentation skills.
- Proficiency in relevant programming languages such as SQL, Python, or Go.
- Familiarity with cloud-native architectures and database best practices.
- Provide technical expertise on GCP database tools.
- Design and support cloud-native database architectures.
- Resolve complex database issues.
- Collaborate with cross-functional teams.
Good to Have:
- Google Cloud certifications.
- Experience in DB migration.
- Knowledge of data security/compliance.
Posted 2 weeks ago
0.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change - we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.
Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.
Inviting applications for the role of Assistant Vice President (AVP), Google Cloud Platform Pre-Sales Solution Architect - Data & AI
About Genpact: Genpact (NYSE: G) is a global professional services firm delivering outcomes that transform businesses. With a proud 25-year history and over 125,000 diverse professionals in 30+ countries, we are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for our clients. We serve leading global enterprises, including the Fortune Global 500, leveraging our deep industry expertise, digital innovation, and cutting-edge capabilities in data, technology, and AI. Join our team to shape the future of business through intelligent operations and drive meaningful impact. 
The Opportunity: Genpact is seeking a highly accomplished and visionary Assistant Vice President (AVP), Google Cloud Platform Pre-Sales Solution Architect, specializing in Data and Artificial Intelligence. This pivotal role will be instrumental in driving Genpact's growth in the GCP ecosystem by leading complex pre-sales engagements, designing transformative data and AI solutions, and fostering executive-level client relationships. You will operate at the intersection of business strategy and cutting-edge technology, translating intricate client challenges into compelling, implementable solutions on Google Cloud.
Responsibilities:
• Executive Solutioning & Strategy: Lead the end-to-end technical pre-sales cycle for Genpact's most strategic data and AI opportunities on GCP. Engage at the CXO level and with senior business and IT stakeholders to deeply understand their strategic objectives, pain points, and competitive landscape.
• Architectural Leadership: Design and articulate highly scalable, secure, and resilient enterprise-grade data and AI architectures on Google Cloud Platform. This includes expertise in BigQuery, Dataflow, Dataproc, Vertex AI (MLOps, Generative AI), Cloud AI services, Looker, Pub/Sub, Cloud Storage, Data Catalog, and other relevant GCP services.
• Value Proposition & Storytelling: Develop and deliver highly impactful presentations, workshops, and proof-of-concepts (POCs) that clearly demonstrate the business value and ROI of Genpact's data and AI solutions on GCP. Craft compelling narratives that resonate with both technical and non-technical audiences.
• Deal Ownership & Closure: Work collaboratively with sales teams to own the technical solutioning and commercial structuring of deals from qualification to closure. Lead the estimation, negotiation, and transition of deals to the delivery organization, ensuring alignment and seamless execution.
• Technical Deep Dive & Expertise: Provide deep technical expertise on Google Cloud's Data & AI portfolio, staying at the forefront of new service offerings, product roadmaps, and competitive differentiators. Act as the subject matter expert in client discussions and internal enablement.
• Cross-Functional Collaboration: Partner effectively with Genpact's sales, delivery, product development, and industry vertical teams to ensure that proposed solutions are innovative, deliverable, and aligned with market demands and Genpact's capabilities.
• Thought Leadership: Contribute to Genpact's market presence and intellectual property through whitepapers, conference presentations, industry events, and client advisory sessions. Position Genpact as a leader in data-driven transformation on GCP.
• Team Mentorship & Enablement: Provide mentorship and technical guidance to junior pre-sales architects and delivery teams, fostering a culture of continuous learning and excellence in GCP Data & AI.
Qualifications we seek in you!
Minimum Qualifications:
• Progressive experience in data, analytics, artificial intelligence, and cloud technologies, with a strong focus on technical pre-sales, solution architecture, or consulting leadership roles.
• Hands-on experience architecting, designing, and delivering complex data and AI solutions on Google Cloud Platform.
• Deep and demonstrable expertise across the Google Cloud Data & AI stack:
  o Core Data Services: BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, Cloud SQL, Cloud Spanner, Composer, Data Catalog, Dataplex.
  o AI/ML Services: Vertex AI (including MLOps, Workbench, Training, Prediction, Explainable AI), Generative AI offerings (e.g., Gemini, Imagen), Natural Language API, Vision AI, Speech-to-Text, Dialogflow, Recommendation AI.
  o BI & Visualization: Looker, Data Studio.
• Proven track record of successfully leading and closing multi-million-dollar deals involving complex data and AI solutions on cloud platforms.
• Exceptional executive presence with the ability to engage, influence, and build trusted relationships with C-level executives and senior stakeholders.
• Strong commercial acumen and experience in structuring complex deals, including pricing models, risk assessment, and contract negotiation.
• Outstanding communication, presentation, and storytelling skills, with the ability to translate complex technical concepts into clear, concise business benefits.
• Demonstrated ability to lead cross-functional teams and drive consensus in dynamic and ambiguous environments.
• Bachelor's degree in Computer Science, Engineering, or a related technical field. Master's degree or MBA preferred.
• Google Cloud Professional Certifications are highly preferred (e.g., Professional Cloud Architect, Professional Data Engineer, Professional Machine Learning Engineer).
• Ability to travel as required to client sites and internal meetings.
Why join Genpact?
• Be a transformation leader - Work at the cutting edge of AI, automation, and digital innovation
• Make an impact - Drive change for global enterprises and solve business challenges that matter
• Accelerate your career - Get hands-on experience, mentorship, and continuous learning opportunities
• Work with the best - Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
• Thrive in a values-driven culture - Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress
Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together. 
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 2 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Mumbai
Work from Office
Objectives of this role
The primary objective of this role is to design, develop, and maintain databases that meet the organization's requirements for storing and analyzing financial data. You will also be responsible for ensuring data integrity, security, and performance across various database platforms.
Your tasks
• Design, develop, and optimize relational and non-relational databases to support the organization's financial data needs.
• Implement data models, schemas, and indexing strategies to optimize database performance and scalability.
• Collaborate with data engineering and software development teams to integrate database solutions into our applications and services.
• Perform database tuning, monitoring, and troubleshooting to ensure high availability and reliability.
• Implement data security measures, including access control and encryption, to protect sensitive financial information.
• Develop and maintain documentation for database design, configuration, and best practices.
• Stay current with emerging database technologies and trends to drive continuous improvement and innovation.
You need to have
• Bachelor's degree in Software Engineering, Computer Science or a related field.
• Minimum 5+ years of experience as a database developer.
• Proven experience as a database developer or administrator, with expertise in relational databases such as MySQL and non-relational databases such as MongoDB, Elasticsearch, and Redis.
• Strong SQL skills and experience with database optimization techniques.
• Experience working with large datasets and complex data models in a financial or similar domain.
• Proficiency in database performance tuning, monitoring, and troubleshooting.
• Excellent problem-solving and analytical skills, with the ability to collaborate effectively in a team environment.
• Familiarity with data security best practices and compliance standards (e.g., GDPR, PCI DSS).
• Capability to work on multiple projects simultaneously. 
• Experience with cloud-based database platforms such as Amazon RDS, Google Cloud SQL, or Azure Cosmos DB.
• Knowledge of distributed database systems and big data technologies (e.g., Hadoop, Spark).
• Experience with data warehousing solutions and ETL processes.
• Familiarity with DevOps practices and tools for database automation and CI/CD.
• Previous experience in the financial services industry or a similar regulated environment.
About Us
NSE Cogencis is a leading provider of data, news, actionable insights and analytics. Professionals across commercial banks, asset management companies, insurance companies, conglomerates and large corporates use our products to trade, manage funds and hedge risks. As part of the NSE Group and a 100% subsidiary of NSE Data, we play an important role in the Indian financial market ecosystem.
Curiosity is our biggest asset, and it's in our DNA. Our curiosity to understand the market trends and challenges faced by today's market professionals drives us to build and manage the most comprehensive database on the Indian financial market, bring exclusive market-moving news to our platform and continuously upgrade our analytical capability. It is CURIOSITY that drives everything we do at Cogencis. Together we learn, innovate and thrive professionally.
We are an equal opportunity employer, and we strive to create a workplace that is not only employee-friendly but puts our employees at the centre of our organisation. The wellbeing and mental health of our employees are a clear priority for us at NSE Cogencis.
Posted 2 weeks ago
5.0 - 7.0 years
6 - 7 Lacs
Chennai
Hybrid
Overview: TekWissen is a global workforce management provider with operations in India and many other countries. The client below is a global company with shared ideals and a deep sense of family. From its earliest days as a pioneer of modern transportation, it has sought to make the world a better place, one that benefits lives, communities and the planet.
Job Title: Specialty Development Practitioner
Location: Chennai
Work Type: Hybrid
Position Description: This role is for a proactive Full Stack Software Engineer responsible for creating products to host Supply Chain Analytics algorithms. You will ensure software engineering excellence while developing web applications and tools, employing practices like pair programming and Test-Driven Development (TDD) within an Agile environment. Key responsibilities include acting as a change agent, mentoring teams on Agile methodologies, and contributing to the client's institutional knowledge. Strong written and oral communication skills are essential for interacting with client leadership, along with a self-starting approach.
Required Skills:
• Bachelor's or Master's degree in Computer Science, Computer Engineering, or a related technical field.
• 5-7+ years of software engineering and testing experience, including Agile methodologies and Jira.
• 3+ years in Python, Java, and Spring Boot development; 3+ years with REST APIs; and 3+ years developing web-based UIs using JavaScript, React, Angular, Vue, or TypeScript, along with Pub/Sub, Apigee, and Cloud Storage.
• Experience with relational (e.g., PostgreSQL, SQL Server), NoSQL, and columnar databases (e.g., BigQuery).
• At least 1 year of experience developing and deploying to cloud platforms such as Google Cloud Platform, Pivotal Cloud Foundry, Amazon Web Services, and Microsoft Azure.
• A passion for clean code and a strong desire for continuous learning. 
Desired Skills: Full-stack expertise; automated testing (unit, integration, E2E); cloud computing/infrastructure experience (especially Google Cloud Platform, Cloud Run containerization, and Google Cloud Storage); and proficiency with Continuous Integration/Continuous Delivery tools like Jenkins, Tekton, or Gradle.
Skills Required: BigQuery, Python, Angular, Relational Databases, Google Cloud Platform, Dataflow, Dataproc, Data Fusion, Terraform, Tekton, Cloud SQL, Airflow, Postgres, PySpark, API
Experience Required: 5+ Years
Education Required: Bachelor's Degree
TekWissen® Group is an equal opportunity employer supporting workforce diversity.
Posted 2 weeks ago
5.0 - 6.0 years
5 - 6 Lacs
Chennai
Hybrid
Overview: TekWissen is a global workforce management provider with operations in India and many other countries. The client below is a global company with shared ideals and a deep sense of family. From its earliest days as a pioneer of modern transportation, it has sought to make the world a better place, one that benefits lives, communities and the planet.
Job Title: Software Engineer Practitioner
Location: Chennai
Work Type: Hybrid
Position Description: We're seeking a highly skilled and experienced Full Stack Data Engineer to play a pivotal role in the development and maintenance of our Enterprise Data Platform. In this role, you'll be responsible for designing, building, and optimizing scalable data pipelines within our Google Cloud Platform (GCP) environment. You'll work with GCP-native technologies like BigQuery, Dataform, Dataflow, and Pub/Sub, ensuring data governance, security, and optimal performance. This is a fantastic opportunity to leverage your full-stack expertise, collaborate with talented teams, and establish best practices for data engineering at the client.
Basic Qualifications:
• Bachelor's or Master's degree in Computer Science, Engineering or a related field of study
• 5+ years - Strong understanding of database concepts and experience with multiple database technologies, optimizing query and data processing performance
• 5+ years - Full-stack data engineering competency in a public cloud (Google)
• Critical thinking skills to propose data solutions, test them, and make them a reality
• 5+ years - Highly proficient in SQL, Python, Java; experience programming engineering transformations in Python or a similar language
• 5+ years - Ability to work effectively across organizations, product teams and business partners 
• 5+ years - Knowledge of Agile (Scrum) methodology; experience in writing user stories
• Deep understanding of data service ecosystems, including data warehousing, lakes and marts
• User experience advocacy through empathetic stakeholder relationships
• Effective communication both internally (with team members) and externally (with stakeholders)
• Knowledge of data warehouse concepts; experience with data warehouse/ETL processes
• Strong process discipline and thorough understanding of IT processes (ISP, data security)
Skills Required: Data Architecture, Data Warehousing, Dataform, Google Cloud Platform - BigQuery, Dataflow, Dataproc, Data Fusion, Terraform, Tekton, Cloud SQL, Airflow, Postgres, PySpark, Python, API
Experience Required:
• Excellent communication, collaboration and influence skills; ability to energize a team
• Knowledge of data, software and architecture operations; data engineering and data management standards, governance and quality
• Hands-on experience in Python using libraries like NumPy, Pandas, etc.
• Extensive knowledge and understanding of GCP offerings and bundled services, especially those associated with data operations: Cloud Console, BigQuery, Dataflow, Dataform, Pub/Sub
• Experience with recoding, re-developing and optimizing data operations, data science and analytical workflows and products
Experience Required: 5+ Years
Education Required: Bachelor's Degree
TekWissen Group is an equal opportunity employer supporting workforce diversity.
Posted 2 weeks ago
7.0 - 12.0 years
20 - 30 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Work Location: Bangalore/Pune/Hyderabad/NCR
Experience: 5-12 yrs
Required Skills:
• Proven experience as a Data Engineer with expertise in GCP.
• Strong understanding of data warehousing concepts and ETL processes.
• Experience with BigQuery, Dataflow, and other GCP data services.
Responsibilities:
• Design, develop, and maintain data pipelines on GCP.
• Implement data storage solutions and optimize data processing workflows.
• Ensure data quality and integrity throughout the data lifecycle.
• Collaborate with data scientists and analysts to understand data requirements.
• Monitor and maintain the health of the data infrastructure.
• Troubleshoot and resolve data-related issues.
Thanks & Regards,
Suganya R
Suganya@spstaffing.in
Posted 2 weeks ago
5.0 - 9.0 years
10 - 20 Lacs
Pune
Hybrid
Role & responsibilities
• Minimum of 5 years of experience in a DevOps, SRE, or Infrastructure Engineering role.
• Solid understanding of Terraform and experience maintaining reusable module libraries.
• Hands-on experience managing workloads on Kubernetes (preferably GKE).
• Working knowledge of CI/CD tools such as GitHub Actions and Helm.
• Familiarity with Google Cloud services, including networking, Cloud SQL (Postgres), and container security.
• Competence in observability tooling, especially Datadog dashboards and alert configurations.
• Strong operational mindset with attention to detail in release processes and deployment integrity.
Desirable Experience
• Exposure to GitOps tools.
• Experience developing or integrating Kubernetes operators.
• Familiarity with service-level indicators (SLIs), service-level objectives (SLOs), and structured alerting.
Tools and Expectations
• Terraform / HCP Terraform - Core to infrastructure provisioning. Required to build, refactor, and maintain reusable infrastructure modules across environments, enforce naming/tagging standards, and leverage state management for drift detection and rollback.
• GitHub / GitLab / GitHub Actions - Central to CI/CD workflows. Expected to enforce secure release procedures, set up integration with code quality tools, and prevent direct changes to critical branches.
• Helm - Used for Kubernetes application packaging and deployment. Must implement pre/post-deployment logic, rollback plans, and chart lifecycle automation.
• GKE / Kubernetes - Platform for hosting applications. The engineer must manage node pools, service networking, security contexts, and namespace segmentation.
• GCP Services (Cloud SQL, VPC, IAM) - Backend for infrastructure workloads.
Posted 2 weeks ago
3.0 - 4.0 years
3 - 7 Lacs
Mumbai
Work from Office
Job Summary
We are seeking an experienced and motivated Data Engineer to join our growing team, preferably with experience in the Banking, Financial Services, and Insurance (BFSI) sector. The ideal candidate will have a strong background in designing, building, and maintaining robust and scalable data infrastructure. You will play a crucial role in developing our data ecosystem, ensuring data quality, and empowering data-driven decisions across the organization. This role requires hands-on experience with the Google Cloud Platform (GCP) and a passion for working with cutting-edge data technologies.
Responsibilities
• Design and Develop End-to-End Data Engineering Pipelines: Build and maintain scalable and reliable data pipelines to ingest, process, and transform large volumes of structured and unstructured data from various sources.
• Implement Data Quality and Governance: Establish and enforce processes for data validation, transformation, auditing, and reconciliation to ensure data accuracy, completeness, and consistency.
• Build and Maintain Data Storage Solutions: Design, implement, and manage data vaults and data marts to support business intelligence, analytics, and reporting requirements.
• Orchestrate and Automate Workflows: Utilize workflow management tools to schedule, monitor, and automate complex data workflows and ETL processes.
• Optimize Data Infrastructure: Continuously evaluate and improve the performance, reliability, and cost-effectiveness of our data infrastructure and pipelines.
• Collaborate with Stakeholders: Work closely with data analysts, data scientists, and business stakeholders to understand their data needs and deliver effective data solutions.
• Documentation: Create and maintain comprehensive documentation for data pipelines, processes, and architectures.
Key Skills
• Python: Proficient in Python for data engineering tasks, including scripting, automation, and data manipulation. 
• PySpark: Strong experience with PySpark for large-scale data processing and analytics.
• SQL: Expertise in writing complex SQL queries for data extraction, transformation, and analysis.
Tech Stack (Must Have)
Google Cloud Platform (GCP):
• Dataproc: for managing and running Apache Spark and Hadoop clusters.
• Composer (Airflow): for creating, scheduling, and monitoring data workflows.
• Cloud Functions: for event-driven serverless data processing.
• Cloud Run: for deploying and scaling containerized data applications.
• Cloud SQL: for managing relational databases.
• BigQuery: for data warehousing, analytics, and large-scale SQL queries.
Qualifications
• Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
• 3+ years of proven experience in a Data Engineer role.
• Demonstrable experience with the specified must-have tech stack.
• Strong problem-solving skills and the ability to work independently and as part of a team.
• Excellent communication and interpersonal skills.
Good to Have
• Experience in the BFSI (Banking, Financial Services, and Insurance) domain.
• Apache NiFi: experience with data flow automation and management.
• Qlik: familiarity with business intelligence and data visualization tools.
• AWS: knowledge of Amazon Web Services data services.
• DevOps and FinOps: understanding of DevOps principles and practices (CI/CD, IaC) and cloud financial management (FinOps) to optimize cloud spending.
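The "validation, auditing, and reconciliation" work this posting asks for can be illustrated with a small, hedged sketch: a control-total check comparing a source extract against the loaded target. Everything here (function and field names, tolerance) is an invented example, not taken from the posting.

```python
# Illustrative only: a basic post-load reconciliation check comparing row
# counts and control totals between a source extract and its loaded target.
def reconcile(source_rows, target_rows, amount_key="amount"):
    """Return a dict of reconciliation checks between two row sets."""
    src_total = sum(r[amount_key] for r in source_rows)
    tgt_total = sum(r[amount_key] for r in target_rows)
    return {
        "row_count_match": len(source_rows) == len(target_rows),
        "control_total_match": abs(src_total - tgt_total) < 1e-9,
        "source_total": src_total,
        "target_total": tgt_total,
    }

src = [{"id": 1, "amount": 100.0}, {"id": 2, "amount": 250.5}]
tgt = [{"id": 1, "amount": 100.0}, {"id": 2, "amount": 250.5}]
print(reconcile(src, tgt))
```

In practice the two row sets would come from the source system and a warehouse query (e.g., BigQuery) rather than in-memory lists, and mismatches would be logged to an audit table.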
Posted 2 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Chennai
Work from Office
• Hands-on experience in data modelling for both OLTP and OLAP systems.
• In-depth knowledge of conceptual, logical, and physical data modelling.
• Strong understanding of indexing, partitioning, and data sharding, with practical experience.
• Experience in identifying and addressing factors affecting database performance for near-real-time reporting and application interaction.
• Proficiency with at least one data modelling tool (preferably DB Schema).
• Functional knowledge of the mutual fund industry is a plus.
• Familiarity with GCP databases like AlloyDB, Cloud SQL, and BigQuery.
• Willingness to work from the Chennai customer site (office presence is mandatory, five days on-site each week).
Mandatory Skills: Cloud-PaaS-GCP-Google Cloud Platform
Experience: 5-8 Years
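As a small, hedged illustration of the indexing knowledge this posting asks for: the sketch below uses SQLite (chosen only because it ships with Python; the posting's platforms are GCP databases) to show how adding an index changes a range query's access path from a full scan to an index search. Table and index names are made up.

```python
# Illustrative only: how an index changes the access path for a range query,
# shown with SQLite's EXPLAIN QUERY PLAN. Names are hypothetical examples.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, ts TEXT, amount REAL)")

query = "SELECT * FROM orders WHERE ts > '2024-01-01'"

# Without an index on ts, the planner must scan the whole table.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[-1]
print(before)  # e.g. "SCAN orders"

# After adding an index, the same query becomes an index search.
conn.execute("CREATE INDEX idx_orders_ts ON orders (ts)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[-1]
print(after)  # e.g. "SEARCH orders USING INDEX idx_orders_ts (ts>?)"
```

The same reasoning carries over to partitioning in BigQuery or Cloud SQL: the goal is to let the engine touch only the rows the predicate needs.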
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
66degrees is a leading consulting and professional services company specializing in developing AI-focused, data-led solutions that leverage the latest advancements in cloud technology. With unmatched engineering capabilities and vast industry experience, we help the world's leading brands transform their business challenges into opportunities and shape the future of work.
In this role, you will be a senior contractor engaged on a 2.5-month remote assignment with the potential to extend. We are looking for candidates with the required skills who can work independently as well as within a team environment.
Your responsibilities will include facilitating, guiding, and influencing the client and teams towards an effective architectural pattern, acting as an interface between business leadership, technology leadership, and the delivery teams. You will perform migration assessments and produce migration plans that encompass Total Cost of Ownership (TCO), migration architecture, migration timelines, and application waves; design solution architecture on Google Cloud to support critical workloads; and handle heterogeneous Oracle migrations to Postgres or Spanner. You will design a migration path that accounts for the conversion of application dependencies, database objects, data, data pipelines, orchestration, users, and security. Your role will also involve overseeing migration activities and providing troubleshooting support, including translation of DDL and DML, executing data transfers using native Google Cloud and 3rd-party tools, and setting up and configuring the relevant Google Cloud components. Furthermore, you will engage with customer teams as a Google Cloud expert to provide education workshops, architectural recommendations, technology reviews, and recommendations.
Qualifications:
- 5+ years of experience with data engineering, cloud architecture, or working with data infrastructure.
- 5+ years of Oracle database management and IT experience. 
- Experience with Oracle Database-adjacent products such as GoldenGate and Data Guard.
- 3+ years of PostgreSQL experience.
- Proven experience performing performance testing and applying remediations to address performance issues.
- Experience in designing data models.
- Proficiency in Python and SQL.
- Advanced SQL skills, including the ability to write, tune, and interpret SQL queries; tool-specific experience in the database platforms listed above is ideal.
- Proven experience migrating and/or implementing cloud databases such as Cloud SQL, Spanner, and Bigtable.

Desired Skills:
- Google Cloud Professional Architect and/or Data Engineer certification is preferred.

66degrees is committed to protecting your privacy and handles personal information in accordance with the California Consumer Privacy Act (CCPA).
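The DDL translation mentioned above can be sketched as a small type-mapping pass. This is a minimal illustration, not a full converter: the mapping table covers only a hypothetical subset of Oracle-to-PostgreSQL type conversions, and real migrations use dedicated tooling.

```python
import re

# Hypothetical subset of Oracle-to-PostgreSQL type mappings used during DDL translation.
TYPE_MAP = {
    r"\bVARCHAR2\((\d+)\)": r"VARCHAR(\1)",
    r"\bNUMBER\((\d+),\s*(\d+)\)": r"NUMERIC(\1,\2)",
    r"\bNUMBER\b": "NUMERIC",
    r"\bDATE\b": "TIMESTAMP",  # Oracle DATE carries a time component
    r"\bCLOB\b": "TEXT",
}

def translate_ddl(oracle_ddl: str) -> str:
    """Rewrite Oracle column types into their PostgreSQL equivalents."""
    out = oracle_ddl
    for pattern, repl in TYPE_MAP.items():
        out = re.sub(pattern, repl, out, flags=re.IGNORECASE)
    return out

ddl = "CREATE TABLE orders (id NUMBER(10,0), note VARCHAR2(200), created DATE)"
print(translate_ddl(ddl))
# → CREATE TABLE orders (id NUMERIC(10,0), note VARCHAR(200), created TIMESTAMP)
```

The parenthesized NUMBER pattern is applied before the bare one so precision/scale survive the rewrite.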
Posted 2 weeks ago
5.0 - 9.0 years
9 - 18 Lacs
Bengaluru
Hybrid
Job Description
5+ years of IT experience. Good understanding of analytics tools for effective analysis of data. Should be able to lead teams. Should have been part of production deployment and production support teams. Experience with Big Data tools: Hadoop, Spark, Apache Beam, Kafka, etc. Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc. Experience with data warehouse tools such as BigQuery, Redshift, Synapse, or Snowflake. Experience in ETL and data warehousing. Experience and firm understanding of relational and non-relational databases such as MySQL, MS SQL Server, Postgres, MongoDB, and Cassandra. Experience with cloud platforms such as AWS, GCP, and Azure. Experience with workflow management using tools like Apache Airflow.

Roles & Responsibilities
Develop high-performance, scalable solutions on GCP that extract, transform, and load big data. Design and build production-grade data solutions from ingestion to consumption using Java/Python. Design and optimize data models on GCP using data stores such as BigQuery. Should be able to handle the deployment process. Optimize data pipelines for performance and cost in large-scale data lakes. Write complex, highly optimized queries across large data sets and create data processing layers. Closely interact with data engineers to identify the right tools to deliver product features by performing POCs. Collaborative team player who interacts with business, BAs, and other data/ML engineers. Research new use cases for existing data.

Preferred: Awareness of design best practices for OLTP and OLAP systems. Should be part of the team designing the DB and pipelines. Should have exposure to load testing methodologies, debugging pipelines, and delta-load handling. Worked on heterogeneous migration projects.
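The delta-load handling mentioned above is commonly implemented with a warehouse MERGE (upsert) statement. A minimal sketch that generates BigQuery-style MERGE SQL; table, key, and column names here are hypothetical placeholders:

```python
def build_merge_sql(target: str, staging: str, key: str, cols: list[str]) -> str:
    """Build a MERGE statement that upserts staged delta rows into a target table."""
    set_clause = ", ".join(f"T.{c} = S.{c}" for c in cols)
    col_list = ", ".join([key] + cols)
    src_list = ", ".join(f"S.{c}" for c in [key] + cols)
    return (
        f"MERGE {target} T USING {staging} S ON T.{key} = S.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({col_list}) VALUES ({src_list})"
    )

# Hypothetical dataset/table names for illustration.
print(build_merge_sql("ds.orders", "ds.orders_delta", "order_id", ["status", "amount"]))
```

Because MERGE matches on the business key, re-running the same delta batch is idempotent, which matters when a pipeline retries.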
Posted 3 weeks ago
4.0 - 9.0 years
10 - 18 Lacs
Chennai
Hybrid
Role & responsibilities
Bachelor's Degree. 2+ years in GCP services: BigQuery, Dataflow, Dataproc, Dataplex, Data Fusion, Terraform, Tekton, Cloud SQL, Memorystore (Redis), Airflow, Cloud Storage. 2+ years in data transfer utilities. 2+ years in Git or any other version control tool. 2+ years in Confluent Kafka. 1+ years of experience in API development. 2+ years in an Agile framework. 4+ years of strong experience in Python and PySpark development. 4+ years of shell scripting to develop ad-hoc jobs for data importing/exporting.

Google Cloud Platform: BigQuery, Dataflow, Dataproc, Data Fusion, Terraform, Tekton, Cloud SQL, Airflow, Postgres, PySpark, Python, API
Posted 3 weeks ago
8.0 - 13.0 years
17 - 22 Lacs
Bengaluru
Work from Office
Your Impact: We are looking for an experienced PostgreSQL database administrator who will be responsible for the performance, availability, security, and backup/recovery of clusters of PostgreSQL instances, along with the opportunity to learn to support Oracle and/or MS SQL instances.

What the role offers: Set up and manage highly available Crunchy Data HA-based PostgreSQL clusters. Patch, upgrade, and maintain PostgreSQL software. Implement minimal-downtime database upgrades using various technologies. Design and implement application-specific data migration solutions for database upgrades to minimize customer impact. Establish PostgreSQL best practices across various deployments. Act as a tech lead within the team to drive our PostgreSQL delivery roadmap and strategy. Proactively review database metrics, identify bottlenecks, and tune the database/queries. Configure and customize monitoring configurations for PostgreSQL databases. Implement backup/recovery strategies with point-in-time restore capability to meet customers' SLAs. Periodically perform data restores to ensure recoverability. Implement and maintain data replication to a disaster recovery environment and execute a disaster recovery exercise annually. Automate routine tasks such as software installation, standby database validation, log rotation, and security auditing. Develop and maintain documented procedures to ensure consistent and effective database operations in the team. Respond to page-outs as part of an on-call rotation, perform incident recovery and root cause analysis, and identify and implement corrective actions.
Act as a PostgreSQL SME, supporting your peers with expertise, input, and insights as needed. Support database environments used by customer-facing OpenText applications aligned to our multi-tenant SaaS stack of products.

What you need to succeed: Bachelor's Degree in Computer Engineering or a related field. At least 8 years of information technology experience. 3+ years of PostgreSQL operations experience. Expert skills in setting up and managing PostgreSQL HA environments. Expert skills in PostgreSQL troubleshooting and performance management. Expert skills in PostgreSQL backup/recovery. Strong Unix skills, especially in writing automation scripts for remote execution. Proficiency in writing and optimizing SQL statements. Ability to thrive in a fast-paced environment working on projects against strict deadlines. Experience supporting enterprise-level database environments.

Additional value-added qualifications: Skills in Oracle database administration. Skills in MS SQL administration. Experience with Terraform, Ansible, or other automation technologies. Experience with GCP Cloud SQL or AWS RDS services. Strong understanding of ITIL principles; certification is a plus. Experience with database monitoring through tools such as Nagios, Zabbix, or New Relic.
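The point-in-time restore capability mentioned above relies on WAL archiving plus a handful of recovery parameters. A minimal sketch that renders those settings (PostgreSQL 12+ parameter names; the archive path and target time are illustrative, not a site-specific runbook):

```python
from datetime import datetime

def pitr_settings(archive_dir: str, target_time: datetime) -> dict:
    """Produce the PostgreSQL settings needed for a point-in-time restore.
    Paths and the recovery target are illustrative placeholders."""
    return {
        # Pull archived WAL segments back during recovery.
        "restore_command": f"cp {archive_dir}/%f %p",
        # Stop replaying WAL at this moment (e.g. just before an accidental DROP).
        "recovery_target_time": target_time.strftime("%Y-%m-%d %H:%M:%S"),
        # Promote to a writable primary once the target is reached.
        "recovery_target_action": "promote",
    }

cfg = pitr_settings("/wal_archive", datetime(2024, 5, 1, 3, 30))
for key, value in cfg.items():
    print(f"{key} = '{value}'")
```

In practice these lines go into postgresql.conf (with a recovery.signal file present) after restoring the base backup.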
Posted 3 weeks ago
2.0 - 7.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Provision MySQL instances, both in clustered and non-clustered configurations. Ensure performance, security, and availability of databases. Work with the various engineering groups to ensure database changes are in line with operational standards and meet the strategies needed to scale. Data mining and data analysis. Prepare documentation and specifications. Handle common database procedures such as upgrade, backup, recovery, and migration. Profile server resource usage; optimize and tweak as necessary. Collaborate with other team members and stakeholders.

Skills and Qualifications
Strong experience in writing SQL queries, Cloud SQL, procedures, and functions (mandatory). Experience in administering MySQL replication, configuration, and deployment strategies (mandatory). Should have experience in data modeling and database design. Experience in scripting preferred (shell, Python, etc.). Must be highly proficient in all aspects of database administration, including backup/recovery/replication, clustering, advanced performance tuning, and proactive monitoring. Experience in handling databases as a service when production environments are in the cloud. Metadata management and repository usage. Ensuring data integrity, performance management, and tuning. General systems management and networking skills. Strong knowledge of database architecture design, including data partitioning. Strong understanding of distributed systems and different levels of data consistency. Experience in any NoSQL database such as MongoDB is preferred. Expert knowledge in maintaining, building, supporting, tuning, and monitoring production MySQL database servers. Understand data locking concepts and the different levels of locking in MySQL.
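The replica provisioning described above usually starts from a small my.cnf fragment. A minimal sketch that renders one for a GTID-based MySQL replica; the values are illustrative defaults, not production tuning:

```python
def replica_config(server_id: int) -> str:
    """Render a minimal my.cnf fragment for a GTID-based MySQL replica.
    Values are illustrative, not a production-ready configuration."""
    lines = [
        "[mysqld]",
        f"server-id = {server_id}",       # must be unique across the replication topology
        "gtid_mode = ON",                 # GTID-based positioning simplifies failover
        "enforce_gtid_consistency = ON",
        "read_only = ON",                 # replicas should not accept direct writes
        "relay_log = relay-bin",
    ]
    return "\n".join(lines)

print(replica_config(2))
```

With GTIDs enabled, pointing the replica at the source reduces to CHANGE REPLICATION SOURCE TO ... SOURCE_AUTO_POSITION=1 rather than tracking binlog file/offset pairs by hand.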
Posted 3 weeks ago
7.0 - 12.0 years
30 - 35 Lacs
Pune
Work from Office
Job Title: Senior Engineer. Location: Pune, India. Corporate Title: AVP.

Role Description
Investment Banking is a technology-centric business, with an increasing move to real-time processing and an increasing appetite from customers for integrated systems and access to supporting data. This means that technology is more important than ever for the business. The IB CARE Platform aims to increase the productivity of both Google Cloud and on-prem application development by providing a frictionless build and deployment platform that offers service and data reusability. The platform provides the chassis and standard components of an application, ensuring reliability, usability, and safety, and gives on-demand access to the services needed to build, host, and manage applications on the cloud/on-prem. In addition to technology services, the platform aims to have compliance baked in, enforcing controls/security and reducing application team involvement in SDLC and ORR controls, enabling teams to focus more on application development and release to production faster. We are looking for a platform engineer to join a global team working across all aspects of the platform, from GCP/on-prem infrastructure and application deployment through to the development of CARE-based services. Deutsche Bank is one of the few banks with the scale and network to compete aggressively in this space, and the breadth of investment in this area is unmatched by our peers. Joining the team is a unique opportunity to help build a platform to support some of our most mission-critical processing systems.

What we'll offer you: 100% reimbursement under the childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Accident and term life insurance.

Your Key Responsibilities
As a CARE platform engineer you will be working across the board on activities to build and support the platform and liaising with tenants.
Key responsibility areas: Responsible for managing and monitoring cloud computing systems and providing technical support to ensure the systems' efficiency and security. Work with platform leads and platform engineers at a technical level. Liaise with tenants regarding onboarding and provide platform expertise. Contribute to the platform offering as part of Sprint deliverables. Support the production platform as part of the wider team.

Your skills and experience: Understanding of GCP and services such as GKE, IAM, identity services, and Cloud SQL. Kubernetes/service mesh configuration. Experience in IaC tooling such as Terraform. Proficient in SDLC/DevOps best practices. GitHub experience, including Git workflow. Exposure to modern deployment tooling, such as ArgoCD, desirable. Programming experience (such as Java/Python) desirable. A strong team player comfortable in a cross-cultural and diverse operating environment. Result-oriented with the ability to deliver under tight timelines. Ability to successfully resolve conflicts in a globally matrix-driven organization. Excellent communication and collaboration skills. Must be comfortable navigating ambiguity to extract meaningful risk insights.

How we'll support you
About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 3 weeks ago
7.0 - 12.0 years
45 - 55 Lacs
Bengaluru
Work from Office
Job Title: Lead Solution Architect, VP. Location: Bangalore, India.

Role Description
We are seeking a highly skilled and experienced Solution Architect to join the CM Tech team owning the firmwide golden reference data source, cRDS. As a Solution Architect, you will play a pivotal role in shaping the future of the CM Tech architecture, leading the development of innovative technical solutions, and contributing to the strategic direction of the application. You will be responsible for defining, documenting, and implementing the overall architecture of cRDS and other client onboarding applications, ensuring its scalability, performance, and security while aligning with business requirements and industry best practices.

What we'll offer you: 100% reimbursement under the childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Accident and term life insurance.

Your key responsibilities: Define the CMT contribution to RCP solutions. Scope solutions to existing and new CMT components. Capture and document assumptions made in lieu of requirements/information for the PO to risk-accept. Define high-level data entities and functional decomposition. Support component guardians in aligning component roadmaps to product strategy and initiative demand. Work with the CTO function to define and document. Outline and define CMT and non-CMT component interactions and interaction contracts for refinement by engineering teams. Identify problems and opportunities, form the business case, and propose solutions. Define phased transitions from current state to target state. Ensure non-functional requirements are considered, including projections. Ensure authentication and authorisation are considered. Ensure the solution design is suitable for build estimation and for grooming Jiras. Provide guardrails on which requirements a component should and should not cover; act as a point of escalation.
Hands-on software development. Knowledge of solution design and architecture. Experience in Agile and Scrum delivery. Should be able to contribute towards good software design. Participate in daily stand-up meetings. Strong communication with stakeholders; articulate issues and risks to management in a timely manner. Train and mentor junior team members to bring them up to speed.

Your skills and experience
Must have (strong technical knowledge required): 7+ years of experience in designing and implementing complex enterprise-scale applications. Proven experience in designing and implementing microservices architectures. Deep understanding of distributed systems and cloud-native technologies. Experience with architectural patterns like event-driven architectures, API gateways, and message queues. Strong understanding of Java core concepts, design patterns, and best practices. Experience with the Spring Boot framework, including dependency injection, Spring Data, and Spring Security. Hands-on experience with a BPM tool (Camunda preferred), including process modeling, workflow automation, and integration with backend systems. Experience with Google Cloud Platform, including services like Cloud Run, Cloud SQL, and Cloud Storage, desirable. Experience with containerization technologies like Docker and Kubernetes. Strong SQL knowledge and experience with advanced database concepts, including relational database design, query optimization, and transaction management. Experience with version control systems like Git and collaborative development tools like Jira and Confluence. Excellent communication and presentation skills, with the ability to effectively convey complex technical concepts to both technical and non-technical audiences. Strong problem-solving skills, with the ability to analyze complex business problems and propose innovative technical solutions. Experience in collaborating with stakeholders, understanding their needs, and translating them into technical solutions.
Technical leadership skills and experience mentoring junior engineers.

Nice to have: Experience with cloud technologies such as Docker, Kubernetes, OpenShift, Azure, AWS, and GCP. Additional languages such as Kotlin, Scala, and Python. Experience with big data/streaming technologies. Experience with end-to-end design and delivery of solutions. Experience with UI frameworks like Angular or React. RDBMS/Oracle design, development, and tuning. Sun/Oracle or architecture-specific certifications.

How we'll support you
About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 3 weeks ago
5.0 - 7.0 years
13 - 17 Lacs
Bengaluru
Work from Office
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.

In your role, you will be responsible for: Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and directly face the customer.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer. Ambitious individual who can work under their own direction towards agreed targets/goals and with a creative approach to work. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
End-to-end functional knowledge of the data pipeline/transformation implementation the candidate has done; should understand the purpose/KPIs for which the data transformation was done.

Preferred technical and professional experience: Experience with AEM core technologies: OSGi services, Apache Sling, Granite framework, Java Content Repository API, Java 8+, localization. Familiarity with build tools such as Jenkins and Maven. Knowledge of version control tools, especially Git. Knowledge of patterns and good practices to design and develop quality, clean code. Knowledge of HTML, CSS, JavaScript, and jQuery. Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence.
Posted 3 weeks ago
8.0 - 13.0 years
15 - 20 Lacs
Bengaluru, Karnataka, India
On-site
Srs Business Solutions India is looking for an experienced GCP Cloud Engineer to join our team in Bangalore. We need a professional with extensive experience in building modern cloud applications on Google Cloud Platform and a strong foundation in software engineering best practices. This is an excellent opportunity for immediate joiners who are ready to make an impact.

Key Responsibilities: Build modern applications primarily utilizing a range of GCP services, including Cloud Build, Cloud Functions/Cloud Run, Google Kubernetes Engine (GKE), Logging, Google Cloud Storage (GCS), Cloud SQL, and Identity and Access Management (IAM). Demonstrate in-depth knowledge and hands-on experience with GKE/Kubernetes. Apply strong software engineering fundamentals, including code and configuration management, CI/CD (continuous integration/continuous delivery) and automation, and automated testing. Collaborate effectively with operations, security, compliance, and architecture groups to develop secure, scalable, and supportable cloud solutions.

Skills & Qualifications: 8+ years of overall experience, with at least 3 years specifically building modern applications utilizing GCP services. Primary proficiency in Python, with experience in a secondary language such as Golang or Java. High emphasis on software engineering fundamentals. Ability to work effectively with various cross-functional teams.

Note: We are specifically seeking immediate joiners (candidates whose notice period is served or who are currently serving their notice period). Please apply only if you meet this criterion.
Posted 3 weeks ago
5.0 - 8.0 years
5 - 9 Lacs
Chennai
Work from Office
GCP Engineer
A GCP developer should have expertise in components such as Cloud Scheduler, Dataflow, BigQuery, Pub/Sub, and Cloud SQL. Good understanding of the GCP cloud environment/services (IAM, networking, Pub/Sub, Cloud Run, Cloud Storage, Cloud SQL/PostgreSQL, Cloud Spanner, etc.) based on real migration projects. Knowledge of Java and Java frameworks; should have leveraged or worked with any or all technology areas such as Spring Boot, Spring Batch, and Spring Cloud. Experience with API and microservice design principles, having leveraged them in actual project implementations for integration. Deep understanding of architecture and design patterns. Needs to have knowledge of implementing event-driven architecture, data integration, event-streaming architecture, and API-driven architecture. Needs to be well versed in DevOps principles and have working experience in Docker/containerization. Experience in the solution and execution of IaaS-, PaaS-, and SaaS-based deployments. Requires conceptual thinking to create 'out of the box' solutions. Should be good at communication and able to handle both the customer and the development team to deliver an outcome.

Mandatory Skills: App-Cloud-Google. Experience: 5-8 Years.
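The event-driven architecture mentioned above is, at its core, the publish/subscribe pattern. A minimal in-memory sketch for illustration: Cloud Pub/Sub provides the durable, distributed equivalent, and the topic name and message shape here are hypothetical.

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Minimal in-memory illustration of the publish/subscribe pattern.
    A real system would use a broker such as Cloud Pub/Sub or Kafka."""

    def __init__(self) -> None:
        self._subs: dict[str, list[Callable]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable) -> None:
        """Register a handler to be invoked for every message on a topic."""
        self._subs[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        """Deliver a message to every handler subscribed to the topic."""
        for handler in self._subs[topic]:
            handler(message)

bus = EventBus()
received: list[dict] = []
bus.subscribe("orders.created", received.append)
bus.publish("orders.created", {"order_id": 42})
print(received)  # → [{'order_id': 42}]
```

The key property, which carries over to the distributed case, is that publishers never reference subscribers directly, so new consumers can be added without touching producer code.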
Posted 3 weeks ago
8.0 - 10.0 years
14 - 18 Lacs
Chennai
Work from Office
GCP Architect
A seasoned architect with 12+ years of experience designing medium- to large-scale application-to-application integration solutions leveraging APIs, API management (APIM), ESB, and product-based hybrid implementations. Good understanding of the GCP cloud environment/services (IAM, networking, Pub/Sub, Cloud Run, Cloud Storage, Cloud SQL/PostgreSQL, Cloud Spanner, etc.) based on real migration projects. Experience/exposure to OpenShift and PCF on GCP and DevSecOps will be an added advantage. Ability to make critical solution design decisions. Knowledge of Java and Java frameworks; should have leveraged or worked with any or all technology areas such as Spring Boot, Spring Batch, and Spring Cloud. Experience with API and microservice design principles, having leveraged them in actual project implementations for integration. Deep understanding of architecture and design patterns. Needs to have knowledge of implementing event-driven architecture, data integration, event-streaming architecture, and API-driven architecture. Needs to have an understanding of, and have designed, integration platforms that meet NFR requirements. Should have implemented design patterns such as integrating with multiple COTS applications and integrations with multiple databases (SQL-based and NoSQL). Has worked with multiple teams to gather integration requirements, create integration specification documents, map specifications, write high-level and detailed designs, and guide the technical team through design and implementation. Needs to be well versed in DevOps principles and have working experience in Docker/containerization. Experience in the solution and execution of IaaS-, PaaS-, and SaaS-based deployments. Requires conceptual thinking to create 'out of the box' solutions. Should be good at communication and able to handle both the customer and the development team to deliver an outcome.

Mandatory Skills: App-Cloud-Google. Experience: 8-10 Years.
Posted 3 weeks ago
10.0 - 20.0 years
10 - 20 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Job Description:
Cloud Infrastructure & Deployment: Design and implement secure, scalable, and highly available cloud infrastructure on GCP. Provision and manage compute, storage, network, and database services. Automate infrastructure using Infrastructure as Code (IaC) tools such as Terraform or Deployment Manager.
Architecture & Design: Translate business requirements into scalable cloud solutions. Recommend GCP services aligned with application needs and cost optimization. Participate in high-level architecture and solution design discussions.
DevOps & Automation: Build and maintain CI/CD pipelines (e.g., using Cloud Build, Jenkins, GitLab CI). Integrate monitoring, logging, and alerting (e.g., Stackdriver / Cloud Operations Suite). Enable autoscaling, load balancing, and zero-downtime deployments.
Security & Compliance: Ensure compliance with security standards and best practices.
Migration & Optimization: Support cloud migration projects from on-premise or other cloud providers to GCP. Optimize performance, reliability, and cost of GCP workloads.
Documentation & Support: Maintain technical documentation and architecture diagrams. Provide L2/L3 support for GCP-based services and incidents.

Required Skills and Qualifications: Google Cloud certification (Associate Cloud Engineer or Professional Cloud Architect/Engineer). Hands-on experience with GCP services (Compute Engine, GKE, Cloud SQL, BigQuery, etc.). Strong command of Linux, shell scripting, and networking fundamentals. Proficiency in Terraform, Cloud Build, Cloud Functions, or other GCP-native tools. Experience with containers and orchestration: Docker, Kubernetes (GKE). Familiarity with monitoring/logging: Cloud Monitoring, Prometheus, Grafana. Understanding of IAM, VPCs, firewall rules, service accounts, and Cloud Identity.
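The Infrastructure as Code automation described above can be sketched by emitting a Terraform resource in Terraform's JSON syntax (Terraform reads *.tf.json files alongside HCL). The bucket name and location below are placeholders, and a real module would add lifecycle rules, labels, and state configuration:

```python
import json

def gcs_bucket_tfjson(name: str, location: str) -> str:
    """Emit a Terraform JSON-syntax resource block for a GCS bucket.
    Resource arguments are a minimal illustrative subset."""
    doc = {
        "resource": {
            "google_storage_bucket": {
                name: {
                    "name": name,
                    "location": location,
                    # Prefer uniform IAM over per-object ACLs.
                    "uniform_bucket_level_access": True,
                }
            }
        }
    }
    return json.dumps(doc, indent=2)

print(gcs_bucket_tfjson("demo-bucket", "ASIA-SOUTH1"))
```

Generating .tf.json programmatically like this is one way teams template out repetitive resources before handing them to terraform plan/apply.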
Posted 3 weeks ago
5.0 - 7.0 years
7 - 9 Lacs
Gurugram
Work from Office
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.

In your role, you will be responsible for: Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and directly face the customer.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer. Ambitious individual who can work under their own direction towards agreed targets/goals and with a creative approach to work. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
End-to-end functional knowledge of the data pipeline/transformation implementation the candidate has done; should understand the purpose/KPIs for which the data transformation was done.

Preferred technical and professional experience: Experience with AEM core technologies: OSGi services, Apache Sling, Granite framework, Java Content Repository API, Java 8+, localization. Familiarity with build tools such as Jenkins and Maven. Knowledge of version control tools, especially Git. Knowledge of patterns and good practices to design and develop quality, clean code. Knowledge of HTML, CSS, JavaScript, and jQuery. Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence.
Posted 4 weeks ago