
17 Datastore Jobs

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

You will have the opportunity to work at Capgemini, a company that empowers you to shape your career according to your preferences. You will be part of a collaborative community of colleagues worldwide, where you can reimagine what is achievable and contribute to unlocking the value of technology for leading organizations to build a more sustainable and inclusive world.

Your Role:
- A very good understanding of the current work and the tools and technologies in use.
- Comprehensive knowledge of BigQuery, ETL, GCS, Airflow/Composer, SQL, and Python.
- Experience with Fact and Dimension tables and SCD.
- A minimum of 3 years of experience in GCP Data Engineering.
- Proficiency in Java, Python, or Spark on GCP, with programming experience in Python, Java, or PySpark, plus SQL.
- Hands-on experience with GCS (Cloud Storage), Composer (Airflow), and BigQuery.
- Ability to handle big data efficiently.

Your Profile:
- Strong data engineering experience using Java or Python, or Spark on Google Cloud.
- Experience in pipeline development using Dataflow or Dataproc (Apache Beam, etc.).
- Familiarity with other GCP services and databases such as Datastore, Bigtable, Spanner, Cloud Run, and Cloud Functions.
- Proven analytical skills and a problem-solving attitude.
- Excellent communication skills.

What you'll love about working here:
- You can shape your career with a range of career paths and internal opportunities within the Capgemini group.
- Comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work.
- The opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. With a diverse team of over 340,000 members in more than 50 countries, Capgemini leverages its over 55-year heritage to unlock the value of technology for clients across the entire breadth of their business needs. The company delivers end-to-end services and solutions, combining strengths from strategy and design to engineering, fueled by market-leading capabilities in AI, generative AI, cloud, and data, along with deep industry expertise and a strong partner ecosystem.
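To make the Composer (Airflow) plus BigQuery stack above concrete, here is a minimal, illustrative DAG that loads daily Parquet files from Cloud Storage into a BigQuery table. The bucket, dataset, and table names are placeholders invented for the sketch, not details from the posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="gcs_to_bigquery_daily",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_sales = GCSToBigQueryOperator(
        task_id="load_sales_facts",
        bucket="example-landing-bucket",             # placeholder bucket
        source_objects=["sales/{{ ds }}/*.parquet"], # partitioned by run date
        destination_project_dataset_table="example_project.dw.fact_sales",
        source_format="PARQUET",
        write_disposition="WRITE_TRUNCATE",          # idempotent daily reload
    )
```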

Posted 3 days ago


3.0 - 5.0 years

2 - 3 Lacs

Bengaluru, Karnataka, India

On-site

As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment, resulting in client delight. You will develop proposals by owning parts of the proposal document and giving inputs on solution design based on your areas of expertise. You will plan configuration activities, configure the product as per the design, conduct conference room pilots, and assist in resolving queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements: As a Snowflake Data Vault developer, you will be responsible for designing, implementing, and managing Data Vault 2.0 models on the Snowflake platform. Candidates should have at least one end-to-end Data Vault implementation. Detailed skill requirements:
- Designing and building flexible and highly scalable Data Vault 1.0 and 2.0 models.
- Suggesting optimization techniques for existing Data Vault models using ghost entries, bridge and PIT tables, reference tables, satellite split/merge, identification of the correct business key, etc.
- Designing and administering repeating design patterns for quick turnaround.
- Engaging and collaborating with customers to understand Data Vault use cases and briefing the technical team with technical specifications.
- Working knowledge of Snowflake is desirable.
- Working knowledge of DBT is desirable.

Preferred Skills: Technology->Data on Cloud-DataStore->Snowflake

Additional Responsibilities:
- Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability.
- Good knowledge of software configuration management systems.
- Awareness of the latest technologies and industry trends.
- Logical thinking and problem-solving skills, along with an ability to collaborate.
- Understanding of financial processes for various project types and the available pricing models.
- Ability to assess current processes, identify improvement areas, and suggest technology solutions.
- Knowledge of one or two industry domains.
- Client interfacing skills.
- Project and team management.

Location of posting: Infosys Ltd. is committed to ensuring you have the best experience throughout your journey with us. We currently have open positions in a number of locations across India: Bangalore, Pune, Hyderabad, Chennai, Chandigarh, Mysore, Kolkata, Trivandrum, Indore, Nagpur, Mangalore, Noida, Bhubaneswar, Coimbatore, Mumbai, Jaipur, Hubli, and Vizag. While we work in accordance with business requirements, we shall strive to offer you the location of your choice, where possible.
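For flavor, here is a minimal sketch of one common Data Vault 2.0 hub-load pattern on Snowflake, driven from Python. Table names, columns, and connection values are invented for the example; real implementations vary, and teams often use dbt for this instead.

```python
import snowflake.connector

# Insert only business keys not yet present in the hub (idempotent load).
HUB_CUSTOMER_LOAD = """
INSERT INTO dv.hub_customer (hub_customer_hk, customer_bk, load_dts, record_source)
SELECT DISTINCT
    MD5(stg.customer_id),        -- hash key derived from the business key
    stg.customer_id,             -- business key
    CURRENT_TIMESTAMP(),
    'CRM_STAGE'
FROM stage.customers stg
LEFT JOIN dv.hub_customer hub
  ON hub.customer_bk = stg.customer_id
WHERE hub.customer_bk IS NULL;
"""

conn = snowflake.connector.connect(
    account="example_account",   # placeholder credentials
    user="example_user",
    password="...",
    warehouse="LOAD_WH",
)
try:
    conn.cursor().execute(HUB_CUSTOMER_LOAD)
finally:
    conn.close()
```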

Posted 1 week ago


5.0 - 8.0 years

4 - 6 Lacs

Bengaluru, Karnataka, India

On-site

As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs and systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements: Primary skills: Technology->Data on Cloud-DataStore->Snowflake

Preferred Skills: Technology->Data on Cloud-DataStore->Snowflake

Additional Responsibilities:
- Knowledge of more than one technology.
- Basics of architecture and design fundamentals.
- Knowledge of testing tools.
- Knowledge of agile methodologies.
- Understanding of project life cycle activities on development and maintenance projects.
- Understanding of one or more estimation methodologies and knowledge of quality processes.
- Basics of the business domain to understand the business requirements.
- Analytical abilities, strong technical skills, and good communication skills.
- Good understanding of the technology and domain.
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods.
- Awareness of the latest technologies and trends.
- Excellent problem-solving, analytical, and debugging skills.

Posted 1 week ago


4.0 - 8.0 years

0 Lacs

Karnataka

On-site

The Service Reliability Engineer (SRE) role requires a mix of strategic engineering and design along with hands-on technical work. You will be responsible for configuring, tuning, and fixing multi-tiered systems to achieve optimal application performance, stability, and availability. Working closely with systems engineers, network engineers, database administrators, the monitoring team, and the information security team, you will play a key role in ensuring service SLAs are met. Your responsibilities will include being the primary point of contact for the data pipeline (Kafka, Hadoop, Cassandra, etc.) and its infrastructure components. You will also write, review, and develop code and documentation that addresses complex problems on large and sophisticated systems. Engaging with and improving the lifecycle of services from inception through deployment, operation, migration, and sunset will be a crucial part of your role. A passion for quality, automation, understanding complex systems, and constantly striving for improvement is essential. Key aspects of this role include setting priorities, working efficiently in a fast-paced environment, optimizing system performance, collaborating with geographically distributed teams, executing high-level projects and migrations, and delivering results on time with high quality. If you enjoy designing and running systems and infrastructure that impact millions of users, this is the place for you.

Key Responsibilities:
- 4 years of experience running services in a large-scale Unix environment.
- Understanding of SRE principles and goals, with solid on-call experience.
- Experience with and understanding of scaling, capacity planning, and disaster recovery.
- Fast learner with excellent analytical, problem-solving, and communication skills.
- Ability to design, author, and release code in languages like Go, Python, Ruby, or Java (a bonus).
- Deep understanding of and experience with technologies such as Kubernetes, AWS, Ansible, Hadoop, Spark, Cassandra, Docker, Mesos, Spinnaker, and Helm.
- Experience supporting Java applications.
- Experience using monitoring and logging solutions like Prometheus, Grafana, and Splunk.
- Familiarity with DNS, HTTP, message queues, queueing theory, RPC frameworks, and data stores.

Additional Responsibilities: You will be part of a team that combines art and technology to deliver entertainment in over 35 languages to more than 150 countries, meeting high performance expectations. Engineers in this role build secure end-to-end solutions, develop custom software for processing creative work, tools for delivering media, server-side systems, and APIs for various services. The focus is on a single unified vision that includes a commitment to strengthening privacy policies, reflecting core values. While services are a significant part of the business, the teams maintain a small, nimble, and cross-functional structure, offering diverse opportunities.

Preferred Skills:
- Core Java
- Scala
- Spark
- Kubernetes
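As a small illustration of the Prometheus/Grafana monitoring work this role mentions, here is a sketch of instrumenting a Python service so Prometheus can scrape request counts and latency. The metric names and the simulated work are invented for the example.

```python
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

REQUESTS = Counter("app_requests_total", "Total requests handled")
LATENCY = Histogram("app_request_latency_seconds", "Request latency in seconds")

def handle_request() -> None:
    with LATENCY.time():                       # observe handler duration
        time.sleep(random.uniform(0.01, 0.1))  # stand-in for real work
    REQUESTS.inc()

if __name__ == "__main__":
    start_http_server(8000)                    # metrics exposed at :8000/metrics
    while True:
        handle_request()
```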

Posted 2 weeks ago


5.0 - 9.0 years

0 Lacs

Haryana

On-site

As the leading technology platform for accelerating life and annuities growth, Zinnia simplifies the experience of buying, selling, and administering insurance products. With a commitment to core values such as being bold, teaming up, and delivering value, Zinnia has over $180 billion in assets under administration and serves 100+ carrier clients, 2,500 distributors and partners, and over 2 million policyholders.

Joining the data team at Zinnia means contributing to the company's goal of uncovering opportunities and making data-driven decisions. Working closely with stakeholders from various departments, the data team focuses on developing deeper predictors of behavior, insights that drive business strategy, and solutions that optimize internal and external experiences.

Your responsibilities will include overseeing technological choices and implementing data pipelines and the warehousing philosophy, leading cross-organizational projects that automate data value chain processes, promoting technical best practices, designing simple and maintainable data architecture, mentoring team members, designing data pipelines, enforcing data governance and security, and partnering with product and engineering teams to maximize the value of downstream data.

To excel in this role, you should have extensive experience with data engineering techniques, Python, and SQL. Familiarity with tools like Airflow and dbt is required, along with expertise in data engineering tooling such as Jira, git, Buildkite, Terraform, and containers. An understanding of ETL patterns, modern data warehousing concepts, and data quality practices, plus a passion for all things data, are essential. You should enjoy both high-level architecture and low-level coding, be proactive in learning and teaching, and be willing to take risks to find innovative solutions.

Technologies you will use include Python, Airbyte, Google Cloud Platform, Terraform, Kubernetes, Cloud SQL, Cloud Functions, BigQuery, DataStore, Airflow, dbt, Tableau, and PowerBI. Join Zinnia's data team to be part of a culture of learning, innovation, and empowerment, where you can challenge yourself to constantly improve and contribute to delivering value from data to both internal and external clients.

Posted 2 weeks ago


3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Your Role:
- Very good understanding of the current work and the tools and technologies being used.
- Comprehensive knowledge of BigQuery, ETL, GCS, Airflow/Composer, SQL, and Python.
- Experience working with Fact and Dimension tables and SCD.
- Minimum 3 years of experience in GCP Data Engineering.
- Java/Python/Spark on GCP, with programming experience in at least one of Python, Java, or PySpark, plus SQL.
- Experience with GCS (Cloud Storage), Composer (Airflow), and BigQuery.
- Should have worked on handling big data.

Your Profile:
- Strong data engineering experience using Java or Python, or Spark on Google Cloud.
- Pipeline development experience using Dataflow or Dataproc (Apache Beam, etc.).
- Experience with other GCP services or databases such as Datastore, Bigtable, Spanner, Cloud Run, and Cloud Functions.
- Proven analytical skills and a problem-solving attitude.
- Excellent communication skills.

What you'll love about working here:
- You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders.
- You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work.
- You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.

Posted 3 weeks ago


5.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Who We Are: Zinnia is the leading technology platform for accelerating life and annuities growth. With innovative enterprise solutions and data insights, Zinnia simplifies the experience of buying, selling, and administering insurance products, enabling more people to protect their financial futures. Our success is driven by a commitment to three core values: be bold, team up, and deliver value. Zinnia has over $180 billion in assets under administration and serves 100+ carrier clients, 2,500 distributors and partners, and over 2 million policyholders.

Who You Are: Our data team serves Zinnia through data engineering, data analysis, and data science. Our goal is to help uncover opportunities and make decisions with data. We partner with department stakeholders across the company to develop deeper predictors of behavior, develop insights that drive business strategy, and build solutions to optimize our internal and external experiences.

What You'll Do:
- Oversee technological choices and the implementation of data pipelines and warehousing philosophy.
- Execute and serve as lead and/or SME on cross-organizational and cross-divisional projects automating our data value chain processes.
- Promote technical best practices throughout the data organization.
- Design data architecture that is simple and maintainable while enabling Data Analysts, Data Scientists, and stakeholders to work with data efficiently.
- Mentor data team members in architecture and coding techniques.
- Serve as a source of knowledge for the Data Engineering team on process improvement, automation, and newly available technologies, enabling best-in-class timeliness and data coverage.
- Design data pipelines utilizing ETL tools, event-driven software, and streaming software.
- Partner with both data scientists and engineers to bring our concepts to reality. This requires learning to speak the language of statisticians as well as software engineers.
- Ensure reliability in data pipelines and enforce data governance, security, and protection of our customers' information while balancing tech debt.
- Demonstrate innovation, customer focus, and an experimentation mindset.
- Partner with product and engineering teams to design data models that maximize downstream data value.
- Evaluate and champion new engineering tools that help us move faster and scale our team.

What You'll Need:
- A technical Bachelor's or Master's degree with 5+ years of experience across data engineering (data pipelining, warehousing, ETL tools, etc.).
- Extensive experience with data engineering techniques, Python, and SQL.
- Familiarity and working knowledge of Airflow and dbt.
- Comfort and expertise with data engineering tooling such as Jira, git, Buildkite, Terraform, Airflow, dbt, and containers, as well as the GCP suite, Kubernetes, and Cloud Functions.
- Understanding of standard ETL patterns, modern data warehousing ideas such as data mesh or data vaulting, and data quality practices around test-driven design and data observability.
- Enjoyment of being a high-level architect at times and a low-level coder at others.
- Passion for all things data: big data, small data, moving and transforming it, its quality, its accessibility, and delivering value from it to internal and external clients.
- Desire for ownership, to solve problems and lead a team delivering modern and efficient data pipeline components.
- Passion for a culture of learning and teaching; you love challenging yourself to constantly improve and sharing your knowledge to empower others.
- Willingness to take risks when looking for novel solutions to complex problems; if faced with roadblocks, you continue to reach higher to make greatness happen.

Technologies you will use:
- Python for data pipelining and automation.
- Airbyte for ETL purposes.
- Google Cloud Platform, Terraform, Kubernetes, Cloud SQL, Cloud Functions, BigQuery, DataStore, and more: we keep adopting new tools as we grow!
- Airflow and dbt for data pipelining.
- Tableau and PowerBI for data visualization and consumer-facing dashboards.

What's In It For You: At Zinnia, you collaborate with smart, creative professionals who are dedicated to delivering cutting-edge technologies, deeper data insights, and enhanced services to transform how insurance is done. Visit our website at www.zinnia.com for more information. Apply by completing the online application on the careers section of our website. We are an Equal Opportunity employer committed to a diverse workforce. We do not discriminate based on race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability.
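For illustration, a hedged sketch of the Airflow-plus-dbt pattern this posting centers on: a DAG that runs dbt models and then dbt's built-in tests as a data-quality gate. The DAG id, project path, and schedule are placeholders, not Zinnia's actual setup.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="warehouse_refresh",       # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/warehouse",   # placeholder path
    )
    dbt_test = BashOperator(
        task_id="dbt_test",           # dbt tests act as a data-quality gate
        bash_command="dbt test --project-dir /opt/dbt/warehouse",
    )
    dbt_run >> dbt_test               # tests only run after a successful build
```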

Posted 3 weeks ago


3.0 - 7.0 years

0 Lacs

Vadodara, Gujarat

On-site

We are seeking enthusiastic and ambitious individuals to join our motivated and results-oriented teams and grow together with Jeavio. This is a great opportunity for an experienced software engineer looking to take on a role as an individual contributor.

Responsibilities:
- Develop, maintain, and enhance Android applications aligned with the organization's objectives and user needs.
- Collaborate with client-side teams to understand task objectives and create better mobile experiences.
- Write clean, efficient, and reusable code that adheres to industry best practices and coding standards.
- Ensure applications adhere to security and data privacy standards.
- Troubleshoot issues arising in development, testing, or production environments.
- Test applications to identify and fix bugs and performance bottlenecks.
- Navigate the learning curve associated with Android development.

Requirements

Mandatory Skills:
- Good experience in Kotlin for Android development and related tools.
- Proficiency in MVVM and Clean Architecture for maintainable apps.
- Experience with Jetpack Compose for UI and XML for legacy views.
- Hands-on experience with Dagger Hilt for dependency injection and efficient code management.
- Experience with Retrofit and OkHttp for REST API calls and Apollo Client for GraphQL APIs.
- Familiarity with Room for local databases and DataStore for preferences storage.
- Skill with Kotlin Coroutines and StateFlow/SharedFlow for background tasks and state management.
- Good understanding and practical experience working with Kotlin collections and performing common operations using Kotlin's standard library functions.
- Familiarity with Play Store distribution processes.

Nice-to-Have Skills:
- Experience with analytics and monitoring tools.
- Proficiency in Map SDKs and geolocation services.
- Understanding of unit testing and test automation.
- Experience with Git, GitHub Actions, Bitrise, and Firebase App Distribution for version control and deployment.
- Knowledge of OAuth 2.0 for secure authentication.
- Understanding of multi-module architecture.

Posted 1 month ago


6.0 - 10.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

You should have 6-10 years of development experience, specifically in Java/J2EE, with strong knowledge of core Java. You must be proficient in Spring frameworks, particularly Spring MVC, Spring Boot, and JPA with Hibernate. Hands-on experience with microservice technology, including development of RESTful and SOAP web services, is essential, along with a good understanding of Oracle DB. Your communication skills, especially when interacting with clients, should be excellent. Experience with build tools like Maven, deployment, and troubleshooting is necessary. Knowledge of CI/CD tools such as Jenkins and experience with Git or similar source control tools are expected. You should also be familiar with Agile/Scrum software development methodologies using tools like Jira, Confluence, and Bitbucket, and have performed requirement analysis. It would be beneficial to have knowledge of frontend stacks like React or Angular, as well as frontend-backend API integration. Experience with AWS, CI/CD best practices, and designing security reference architectures for AWS infrastructure applications is advantageous. You should possess good verbal and written communication skills, the ability to multitask in a fast-paced environment, and be highly organized and detail-oriented. Awareness of common information security principles and practices is required. TELUS International is committed to creating a diverse and inclusive workplace and is an equal opportunity employer. All employment decisions are based on qualifications, merit, competence, and performance without regard to any characteristic related to diversity.

Posted 1 month ago


1.0 - 5.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a member of our team, you will have the opportunity to tackle various challenges in supporting and building highly available systems. You will work closely with teams in the U.S. and India, enhancing the team's capabilities to benefit the broader organization. Your responsibilities may include designing and implementing solutions to streamline manual operations, collaborating with other operational team members to address security and production issues, and conducting root cause analysis for critical incidents. Additionally, you will contribute to expanding the capacity and performance of existing operational systems. We are seeking a self-motivated, detail-oriented individual with a dynamic approach and a strong technical background.

Minimum Qualifications:
- 1-2 years of experience in software engineering.
- Bachelor's or Master's degree (or equivalent) in Computer Science or a related field.
- Knowledge of standard software engineering processes.
- Understanding of software architecture, deployment, and optimization of infrastructure in on-premises and third-party cloud environments.
- Proficiency in at least one programming and scripting language.
- Ability to troubleshoot and maintain various aspects of infrastructure, such as compute, systems, network, storage, and data stores.
- Experience implementing applications in private/public cloud infrastructure and utilizing container technologies like Kubernetes and Docker.
- Proficiency in developing software tooling for programmable infrastructure, creating environments, and establishing CI/CD pipelines using tools such as Terraform, CloudFormation, Ansible, and the Kubernetes toolset (e.g., kubectl, kustomize).

Preferred Qualifications:
- Familiarity with build and deployment systems using Maven and Git.
- Knowledge of observability tools like Grafana, Splunk, etc.
- Interest or experience in automation is a plus.
- Self-motivated, independent, and dedicated, with excellent organizational skills.
- Strong written and verbal communication skills.

If you meet the qualifications and are excited about the opportunity, we encourage you to submit your CV for consideration.
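As a flavor of the "software tooling for programmable infrastructure" this posting asks for, here is a small, illustrative script using the official Kubernetes Python client to report pods that are not healthy. The namespace handling and output format are simplified assumptions for the sketch.

```python
from kubernetes import client, config

def report_unhealthy_pods() -> None:
    config.load_kube_config()       # use load_incluster_config() when run inside a pod
    v1 = client.CoreV1Api()
    for pod in v1.list_pod_for_all_namespaces().items:
        phase = pod.status.phase
        if phase not in ("Running", "Succeeded"):
            # Flag anything pending, failed, or in an unknown state
            print(f"{pod.metadata.namespace}/{pod.metadata.name}: {phase}")

if __name__ == "__main__":
    report_unhealthy_pods()
```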

Posted 1 month ago


0.0 years

13 Lacs

Bengaluru, Karnataka, India

On-site

Key Responsibilities: A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to provide best-fit architectural solutions for one or more projects. You would also provide technology consultation and assist in defining the scope and sizing of work. You would implement solutions, create technology differentiation, and leverage partner technologies. Additionally, you would participate in competency development with the objective of ensuring best-fit, high-quality technical solutions. You would be a key contributor to creating thought leadership within your area of technology specialization, in compliance with the guidelines, policies, and norms of Infosys. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you.

Technical Requirements: Primary skills: Technology->Data on Cloud-DataStore->Snowflake

Additional Responsibilities:
- Knowledge of architectural design patterns, performance tuning, and database and functional designs.
- Hands-on experience in Service-Oriented Architecture.
- Ability to lead solution development and delivery for the designed solutions.
- Experience in designing high-level and low-level documents is a plus.
- A good understanding of the SDLC is a prerequisite.
- Awareness of the latest technologies and trends.
- Logical thinking and problem-solving skills, along with an ability to collaborate.

Preferred Skills: Technology->Data on Cloud-DataStore->Snowflake

Posted 2 months ago


0.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description: A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs and systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you.

Key Responsibilities:
- Knowledge of more than one technology.
- Basics of architecture and design fundamentals.
- Knowledge of testing tools.
- Knowledge of agile methodologies.
- Understanding of project life cycle activities on development and maintenance projects.
- Understanding of one or more estimation methodologies and knowledge of quality processes.
- Basics of the business domain to understand the business requirements.
- Analytical abilities, strong technical skills, and good communication skills.
- Good understanding of the technology and domain.
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods.
- Awareness of the latest technologies and trends.
- Excellent problem-solving, analytical, and debugging skills.

Technical Requirements: Technology->Data on Cloud-Datastore->Cloud-based Integration Platforms->Informatica Intelligent Cloud Services (IICS)

Preferred Skills: Technology->Data on Cloud-Datastore->Cloud-based Integration Platforms->Informatica Intelligent Cloud Services (IICS)

Posted 2 months ago


9.0 - 14.0 years

22 - 37 Lacs

Pune

Hybrid

We're Hiring: Senior GCP Data Engineer (L4) for a client (immediate joiners only). Location: Pune | Walk-in Drive: 5th July 2025. Are you a seasoned Data Engineer with 9-12 years of experience and a passion for building scalable data solutions on Google Cloud Platform? Join us for an exciting walk-in opportunity!

Key Skills Required:
- GCP Data Engineering, BigQuery, SQL
- Python (Cloud Composer, Cloud Functions, Python Injection)
- Dataproc + PySpark, Dataflow + Pub/Sub
- Apache Beam, Spark, Hadoop

What You'll Do:
- Architect and implement end-to-end data pipelines on GCP
- Work with BigQuery, BigTable, Cloud Storage, Spanner, and more
- Automate data ingestion, transformation, and augmentation
- Ensure data quality and compliance across systems
- Collaborate in a fast-paced, dynamic environment

Bonus Points:
- Google Professional Data Engineer or Solution Architect certification
- Experience with SnapLogic and Cloud Dataprep
- Strong SQL and data integration expertise

If interested, please share your CV at Raveena.kalra@in.ey.com
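To give the "Dataflow + Pub/Sub" and Apache Beam skills above some shape, here is a minimal streaming sketch: read JSON events from a Pub/Sub subscription and append them to BigQuery. Subscription, table, and field names are invented for the example.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Streaming mode; on Dataflow, add --runner=DataflowRunner and project flags.
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            subscription="projects/example/subscriptions/events-sub"  # placeholder
        )
        | "ParseJson" >> beam.Map(json.loads)  # Pub/Sub yields bytes; parse to dicts
        | "WriteToBq" >> beam.io.WriteToBigQuery(
            "example_project:analytics.events",  # placeholder table
            schema="event_id:STRING,payload:STRING,ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```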

Posted 2 months ago


3.0 - 4.0 years

3 - 4 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

You will be responsible for developing mobile applications using the Android framework. You will work closely with the development team to design, develop, and implement innovative mobile solutions that meet our requirements.

Roles and Responsibilities:
- Excellent debugging skills.
- Design and build applications for the Android platform.
- Work with third-party libraries (Room DB), JSON data parsing, and REST API implementation using Retrofit.
- Working experience with design patterns like MVVM architecture.
- Work on bug fixing and improving application performance.
- Work on Google Maps, Google Analytics, and Firebase (real-time database and push notifications).
- Work on Firebase Crashlytics or Sentry.
- Experience with the Android SDK.
- Solid understanding of the entire mobile development life cycle.
- Intermediate understanding of, and working experience with, the Jetpack Compose library.

Posted 3 months ago


5.0 - 9.0 years

20 - 25 Lacs

Chennai

Work from Office

Skills Required:
- Python, Java, C/C++, Ruby, and JavaScript
- J2EE, NoSQL/SQL datastores, Spring Boot, GCP/AWS/Azure, and Docker/K8s
- RESTful APIs and microservices platforms
- Experience with APM and other monitoring tools

Experience: 5+ years | CTC: up to 28 LPA | Location: Chennai
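For the "RESTful APIs and microservices" skill above, here is a bare-bones sketch of a Python microservice with a health endpoint suitable for Docker/K8s probes. Flask is one common choice, not necessarily the stack this client uses; the endpoints and payload shape are invented.

```python
from flask import Flask, jsonify

app = Flask(__name__)

@app.get("/health")                  # liveness endpoint for container probes
def health():
    return jsonify(status="ok")

@app.get("/items/<int:item_id>")     # hypothetical resource endpoint
def get_item(item_id: int):
    return jsonify(id=item_id, name=f"item-{item_id}")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```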

Posted 3 months ago


3.0 - 4.0 years

3 - 7 Lacs

Bengaluru

Work from Office

You will be responsible for developing mobile applications using the Android framework. You will work closely with the development team to design, develop, and implement innovative mobile solutions that meet our requirements.

Roles and Responsibilities:
- Excellent debugging skills.
- Design and build applications for the Android platform.
- Work with third-party libraries (Room DB), JSON data parsing, and REST API implementation using Retrofit.
- Working experience with design patterns like MVVM architecture.
- Work on bug fixing and improving application performance.
- Work on Google Maps, Google Analytics, and Firebase (real-time database and push notifications).
- Work on Firebase Crashlytics or Sentry.
- Experience with the Android SDK.
- Solid understanding of the entire mobile development life cycle.
- Intermediate understanding of, and working experience with, the Jetpack Compose library.

Skills: Core Java, Kotlin, Coroutines, DI (Dependency Injection), DataStore, Room Database (SQLite), ConstraintLayout, LiveData, MVVM architecture, version control (Git), communication, coordination, and teamwork.

Posted 3 months ago


0.0 years

2 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Key Responsibilities: We are seeking a skilled and detail-oriented Data Warehouse Engineer to design, build, and maintain scalable data warehouse solutions. You will be responsible for developing efficient data pipelines, integrating diverse data sources, ensuring data accuracy, and enabling high-quality analytics to drive business decisions.

Responsibilities:
- Design, develop, and maintain data warehouse architectures and systems.
- Build robust ETL (Extract, Transform, Load) processes for structured and unstructured data sources.
- Optimize data models, database performance, and storage solutions.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements.
- Implement data quality checks and ensure data governance best practices.
- Develop and maintain documentation related to data warehouse design, data flow, and processes.
- Monitor system performance and proactively identify areas for improvement.
- Support ad hoc data requests and reporting needs.
- Stay up to date with emerging data technologies and industry best practices.

Preferred Skills: Technology->ETL & Data Quality->ETL - Others, Technology->Database->Data Modeling, Technology->Data Management - DB->DB2, Technology->Data on Cloud-DataStore->Snowflake
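As a toy illustration of the ETL-with-quality-checks responsibilities above, here is a minimal extract-transform-load flow in Python with a simple data-quality gate. File paths, column names, and the Parquet target are placeholders; a real pipeline would load into a warehouse rather than a local file.

```python
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    return pd.read_csv(path)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    df = df.drop_duplicates(subset="order_id")           # hypothetical key column
    df["order_date"] = pd.to_datetime(df["order_date"])  # normalize types
    return df

def quality_check(df: pd.DataFrame) -> None:
    # Fail the run rather than load incomplete keys downstream
    if df["order_id"].isna().any():
        raise ValueError("order_id contains nulls; aborting load")

def load(df: pd.DataFrame, target: str) -> None:
    df.to_parquet(target)                                # stand-in for a warehouse load

if __name__ == "__main__":
    orders = transform(extract("orders.csv"))
    quality_check(orders)
    load(orders, "orders.parquet")
```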

Posted 3 months ago
