4.0 - 9.0 years
12 - 22 Lacs
Gurugram
Work from Office
To Apply - Submit Details via Google Form - https://forms.gle/8SUxUV2cikzjvKzD9

As a Senior Consultant in our Consulting team, you'll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations. We are seeking experienced AWS Data Engineers to design, implement, and maintain robust data pipelines and analytics solutions using AWS services. The ideal candidate will have a strong background in AWS data services, big data technologies, and programming languages.

Role & responsibilities
1. Design and implement scalable, high-performance data pipelines using AWS services
2. Develop and optimize ETL processes using AWS Glue, EMR, and Lambda
3. Build and maintain data lakes using S3 and Delta Lake
4. Create and manage analytics solutions using Amazon Athena and Redshift
5. Design and implement database solutions using Aurora, RDS, and DynamoDB
6. Develop serverless workflows using AWS Step Functions
7. Write efficient and maintainable code using Python/PySpark and SQL/PostgreSQL
8. Ensure data quality, security, and compliance with industry standards
9. Collaborate with data scientists and analysts to support their data needs
10. Optimize data architecture for performance and cost-efficiency
11. Troubleshoot and resolve data pipeline and infrastructure issues

Preferred candidate profile
1. Bachelor's degree in Computer Science, Information Technology, or a related field
2. Relevant years of experience as a Data Engineer, with at least 60% of experience focused on AWS
3. Strong proficiency in AWS data services: Glue, EMR, Lambda, Athena, Redshift, S3
4. Experience with data lake technologies, particularly Delta Lake
5. Expertise in database systems: Aurora, RDS, DynamoDB, PostgreSQL
6. Proficiency in Python and PySpark programming
7. Strong SQL skills and experience with PostgreSQL
8. Experience with AWS Step Functions for workflow orchestration

Technical Skills:
- AWS Services: Glue, EMR, Lambda, Athena, Redshift, S3, Aurora, RDS, DynamoDB, Step Functions
- Big Data: Hadoop, Spark, Delta Lake
- Programming: Python, PySpark
- Databases: SQL, PostgreSQL, NoSQL
- Data Warehousing and Analytics
- ETL/ELT processes
- Data Lake architectures
- Version control: Git
- Agile methodologies
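The extract-transform-load pattern this role centers on can be sketched minimally. This is an illustrative stand-in, not the employer's codebase: stdlib sqlite3 takes the place of S3/Glue on the extract side and Redshift/Delta Lake on the load side, and all field and table names are made up.

```python
# Minimal ETL sketch: extract raw records, transform/normalise them,
# then load with an idempotent upsert. sqlite3 stands in for the real
# source and warehouse; schema and field names are hypothetical.
import sqlite3

def extract(rows):
    """Extract: yield raw records (stand-in for reading S3 objects)."""
    for row in rows:
        yield row

def transform(record):
    """Transform: normalise fields and derive a revenue column."""
    qty = int(record["qty"])
    return {
        "sku": record["sku"].strip().upper(),
        "qty": qty,
        "revenue": qty * float(record["unit_price"]),
    }

def load(conn, records):
    """Load: idempotent upsert so re-running the pipeline is safe."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (sku TEXT PRIMARY KEY, qty INT, revenue REAL)"
    )
    conn.executemany(
        "INSERT INTO sales VALUES (:sku, :qty, :revenue) "
        "ON CONFLICT(sku) DO UPDATE SET qty=excluded.qty, revenue=excluded.revenue",
        records,
    )
    conn.commit()

raw = [{"sku": " ab1 ", "qty": "3", "unit_price": "9.5"}]
conn = sqlite3.connect(":memory:")
load(conn, [transform(r) for r in extract(raw)])
```

In a Glue or PySpark job the same three stages exist; the upsert step is what keeps reruns from duplicating rows.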
Posted 1 month ago
4.0 - 9.0 years
12 - 22 Lacs
Gurugram, Bengaluru
Work from Office
To Apply - Submit Details via Google Form - https://forms.gle/8SUxUV2cikzjvKzD9

As a Senior Consultant in our Consulting team, you'll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations. We are seeking experienced AWS Data Engineers to design, implement, and maintain robust data pipelines and analytics solutions using AWS services. The ideal candidate will have a strong background in AWS data services, big data technologies, and programming languages.

Role & responsibilities
1. Design and implement scalable, high-performance data pipelines using AWS services
2. Develop and optimize ETL processes using AWS Glue, EMR, and Lambda
3. Build and maintain data lakes using S3 and Delta Lake
4. Create and manage analytics solutions using Amazon Athena and Redshift
5. Design and implement database solutions using Aurora, RDS, and DynamoDB
6. Develop serverless workflows using AWS Step Functions
7. Write efficient and maintainable code using Python/PySpark and SQL/PostgreSQL
8. Ensure data quality, security, and compliance with industry standards
9. Collaborate with data scientists and analysts to support their data needs
10. Optimize data architecture for performance and cost-efficiency
11. Troubleshoot and resolve data pipeline and infrastructure issues

Preferred candidate profile
1. Bachelor's degree in Computer Science, Information Technology, or a related field
2. Relevant years of experience as a Data Engineer, with at least 60% of experience focused on AWS
3. Strong proficiency in AWS data services: Glue, EMR, Lambda, Athena, Redshift, S3
4. Experience with data lake technologies, particularly Delta Lake
5. Expertise in database systems: Aurora, RDS, DynamoDB, PostgreSQL
6. Proficiency in Python and PySpark programming
7. Strong SQL skills and experience with PostgreSQL
8. Experience with AWS Step Functions for workflow orchestration

Technical Skills:
- AWS Services: Glue, EMR, Lambda, Athena, Redshift, S3, Aurora, RDS, DynamoDB, Step Functions
- Big Data: Hadoop, Spark, Delta Lake
- Programming: Python, PySpark
- Databases: SQL, PostgreSQL, NoSQL
- Data Warehousing and Analytics
- ETL/ELT processes
- Data Lake architectures
- Version control: Git
- Agile methodologies
Posted 1 month ago
5.0 - 10.0 years
9 - 13 Lacs
Hyderabad
Work from Office
Overview
As an Analyst, Data Modeler, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analysing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse while satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will perform all aspects of data modeling, working closely with the Data Governance, Data Engineering, and Data Architecture teams. As a member of the Data Modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. You will establish data design patterns that drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities
- Complete conceptual, logical, and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse, or other cloud data warehousing technologies.
- Govern data design/modeling documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned.
- Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
- Support assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools.
- Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of changes or new development.
- Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
- Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
- Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates the key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy-by-design principles (PII management), all linked across fundamental identity foundations.
- Drive collaborative reviews of design, code, data, and security feature implementation performed by data engineers to drive data product development.
- Assist with data planning, sourcing, collection, profiling, and transformation.
- Create source-to-target mappings for ETL and BI developers.
- Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical stores and data lakes; data streaming (consumption/production); data in transit.
- Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing.
- Partner with the Data Governance team to standardize the classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
- Support data lineage and mapping of source system data to canonical data stores for research, analysis, and productization.

Qualifications
- Bachelor's degree required in Computer Science, Data Management/Analytics/Science, Information Systems, Software Engineering, or a related technology discipline.
- 5+ years of overall technology experience, including at least 2+ years of data modeling and systems architecture.
- Around 2+ years of experience with data lake infrastructure, data warehousing, and data analytics tools.
- 2+ years of experience developing enterprise data models.
- Experience building solutions in the retail or supply chain space.
- Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models).
- Experience with integration of multi-cloud services (Azure) with on-premises technologies.
- Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
- Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and deployment & CI tools.
- Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Familiarity with business intelligence tools (such as Power BI).
- Excellent verbal and written communication and collaboration skills.
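The physical/logical modeling work described above often lands as a star schema: surrogate-keyed dimensions plus a fact table keyed by them. The sketch below is illustrative only; table and column names are invented, and sqlite3 stands in for the target warehouse so the DDL stays runnable.

```python
# Illustrative physical star schema for a revenue-style subject area.
# Hypothetical names throughout; sqlite3 stands in for Synapse/Snowflake.
import sqlite3

DDL = """
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,   -- surrogate key
    product_code TEXT NOT NULL UNIQUE,  -- natural/business key
    category     TEXT
);
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,       -- e.g. 20240131
    iso_date TEXT NOT NULL
);
CREATE TABLE fact_sales (
    product_key INTEGER NOT NULL REFERENCES dim_product(product_key),
    date_key    INTEGER NOT NULL REFERENCES dim_date(date_key),
    units       INTEGER NOT NULL,
    net_revenue REAL NOT NULL,
    PRIMARY KEY (product_key, date_key)
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)
tables = {r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")}
```

Keeping surrogate keys separate from natural keys is what makes such a model extensible with minimal rework: new sources map onto the same conformed dimensions.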
Posted 1 month ago
3.0 - 7.0 years
4 - 7 Lacs
Hyderabad
Work from Office
What you will do
Let's do this. Let's change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and driving data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing
- Be a key team member assisting in the design and development of the data pipeline
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks
- Collaborate with multi-functional teams to understand data requirements and design solutions that meet business needs
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
- Implement data security and privacy measures to protect sensitive data
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Collaborate and communicate effectively with product teams
- Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
- Identify and resolve complex data-related challenges
- Adhere to standard methodologies for coding, testing, and designing reusable code/components
- Explore new tools and technologies that will help to improve ETL platform performance
- Participate in sprint planning meetings and provide estimations on technical implementation

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Doctorate degree OR
- Master's degree and 4 to 6 years of Computer Science, IT, or related field experience OR
- Bachelor's degree and 6 to 8 years of Computer Science, IT, or related field experience OR
- Diploma and 10 to 12 years of Computer Science, IT, or related field experience

Preferred Qualifications:

Functional Skills (Must-Have):
- Proficiency in Python, PySpark, and Scala for data processing and ETL (Extract, Transform, Load) workflows, with hands-on experience using Databricks for building ETL pipelines and handling big data processing
- Experience with data warehousing platforms such as Amazon Redshift or Snowflake
- Strong knowledge of SQL and experience with relational databases (e.g., PostgreSQL, MySQL)
- Familiarity with big data frameworks like Apache Hadoop, Spark, and Kafka for handling large datasets
- Experience with software engineering best practices, including but not limited to version control (GitLab, Subversion, etc.), CI/CD (Jenkins, GitLab, etc.), automated unit testing, and DevOps
- Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA)

Good-to-Have Skills:
- Experience with cloud platforms such as AWS, particularly data services (e.g., EKS, EC2, S3, EMR, RDS, Redshift/Spectrum, Lambda, Glue, Athena)
- Strong understanding of data modeling, data warehousing, and data integration concepts
- Understanding of machine learning pipelines and frameworks for ML/AI models

Professional Certifications:
- AWS Certified Data Engineer (preferred)
- Databricks Certified (preferred)

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills

Equal opportunity statement
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
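The automated unit testing this posting lists can be sketched as a pure transform function plus an assertion-based test, the kind of check a Jenkins or GitLab CI stage would run on every commit. The function and field names below are invented for illustration, not Amgen's code.

```python
# Sketch of a unit-tested ETL transform: a pure function that normalises
# a record and flags rows failing validation, plus a CI-style test.
# Field names ("patient_id", "dose_mg") are purely illustrative.
def standardise_record(rec):
    """Trim identifiers, coerce types, and flag rows that fail validation."""
    try:
        return {
            "patient_id": rec["patient_id"].strip(),
            "dose_mg": float(rec["dose_mg"]),
            "valid": True,
        }
    except (KeyError, ValueError):
        return {"patient_id": rec.get("patient_id"), "dose_mg": None, "valid": False}

def test_standardise_record():
    ok = standardise_record({"patient_id": " P001 ", "dose_mg": "2.5"})
    assert ok == {"patient_id": "P001", "dose_mg": 2.5, "valid": True}
    bad = standardise_record({"patient_id": "P002", "dose_mg": "n/a"})
    assert bad["valid"] is False

test_standardise_record()
```

Keeping the transform pure (no I/O) is what makes it unit-testable; the pipeline wiring is tested separately.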
Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 1 month ago
4.0 - 7.0 years
15 - 27 Lacs
Chennai
Work from Office
We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in!

REQUIREMENTS:
- Total experience 3+ years.
- Excellent knowledge of and experience in big data engineering.
- Strong experience with AWS services, especially S3, Glue, Athena, and EMR.
- Hands-on programming experience in Python, Spark, PySpark, and SQL.
- Proficiency in working with data warehouses such as Amazon Redshift or Snowflake.
- Experience handling structured and semi-structured data.
- Strong understanding of ETL/ELT processes and data transformation techniques.
- Proven experience in cross-functional collaboration with technical and business teams.
- Familiarity with data modeling, data warehousing, and building distributed systems.
- Expertise in Spanner for high-availability, scalable database solutions.
- Knowledge of data governance and security practices in cloud-based environments.
- Problem-solving mindset with the ability to tackle complex data engineering challenges.
- Strong communication and teamwork skills, with the ability to mentor and collaborate effectively.
- Experience with creating technical documentation and solution designs.
RESPONSIBILITIES:
- Writing and reviewing great quality code
- Understanding the client's business use cases and technical requirements and converting them into a technical design that elegantly meets the requirements
- Mapping decisions with requirements and being able to translate the same to developers
- Identifying different solutions and narrowing down the best option that meets the client's requirements
- Defining guidelines and benchmarks for NFR considerations during project implementation
- Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for the developers
- Reviewing architecture and design on various aspects like extensibility, scalability, security, design patterns, user experience, NFRs, etc., and ensuring that all relevant best practices are followed
- Developing and designing the overall solution for defined functional and non-functional requirements, and defining the technologies, patterns, and frameworks to materialize it
- Understanding and relating technology integration scenarios and applying these learnings in projects
- Resolving issues raised during code review through exhaustive, systematic analysis of the root cause, and being able to justify the decisions taken
- Carrying out POCs to make sure that the suggested design/technologies meet the requirements
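Handling semi-structured data, as the requirements above mention, typically means flattening nested JSON into tabular rows before loading to a warehouse. A minimal stdlib sketch of that step follows; in practice this logic would run inside a Glue/PySpark job, and the sample record is invented.

```python
# Flatten nested JSON records into warehouse-friendly rows with
# dotted column names. Pure stdlib; the sample payload is hypothetical.
import json

def flatten(obj, prefix=""):
    """Recursively flatten nested dicts into dotted column names."""
    out = {}
    for key, value in obj.items():
        col = f"{prefix}{key}"
        if isinstance(value, dict):
            out.update(flatten(value, col + "."))
        else:
            out[col] = value
    return out

raw = json.loads('{"id": 7, "customer": {"name": "Acme", "region": {"code": "EU"}}}')
row = flatten(raw)
```

The same dotted-path convention is what warehouse engines like Redshift Spectrum or Athena produce when unnesting structs, so downstream SQL stays uniform.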
Posted 1 month ago
4.0 - 9.0 years
6 - 10 Lacs
Bengaluru
Work from Office
- Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
- Minimum 4 years of relevant experience.
- Proficient in Python with hands-on experience building ETL pipelines for data extraction, transformation, and validation.
- Strong SQL skills for working with structured data.
- Familiar with Grafana or Kibana for data visualization and monitoring/dashboards.
- Experience with databases such as MongoDB, Elasticsearch, and MySQL.
- Comfortable working in Linux environments using common Unix tools.
- Hands-on experience with Git, Docker, and virtual machines.
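The monitoring/dashboard side of this role usually starts with shaping raw events into a time series a Grafana or Kibana panel can plot. The sketch below is stdlib-only and the event shape is invented; real data would come from Elasticsearch or MySQL.

```python
# Prepare dashboard data: bucket raw event timestamps into per-minute
# counts, the typical input for a Grafana/Kibana time-series panel.
# The "ts" field and the sample events are hypothetical.
from collections import Counter
from datetime import datetime

def per_minute_counts(events):
    """Truncate each ISO timestamp to the minute and count events per bucket."""
    buckets = Counter()
    for ev in events:
        ts = datetime.fromisoformat(ev["ts"])
        buckets[ts.replace(second=0, microsecond=0).isoformat()] += 1
    return dict(buckets)

events = [
    {"ts": "2024-05-01T10:00:05"},
    {"ts": "2024-05-01T10:00:59"},
    {"ts": "2024-05-01T10:01:10"},
]
series = per_minute_counts(events)
```

In production the bucketing is often pushed down to the store (a date histogram aggregation in Elasticsearch, or GROUP BY a truncated timestamp in MySQL) rather than done in Python.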
Posted 1 month ago
3.0 - 7.0 years
2 - 6 Lacs
Hyderabad, Pune, Gurugram
Work from Office
Location: Pune, Hyderabad, Gurgaon, Bangalore [Hybrid]. Skills: Python, PySpark, SQL, IAM, CloudFormation, Step Functions, and Redshift.
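Step Functions, named in the stack above, sequences tasks as a state machine with per-state retry. The pure-Python miniature below illustrates that idea without AWS; the state names and handlers are invented, and the real service is configured in the Amazon States Language rather than code like this.

```python
# Minimal state-machine runner in the spirit of Step Functions:
# each state has a handler and a next state, with bounded retries.
# State names and handlers are purely illustrative.
def run_state_machine(states, start, data, max_retries=2):
    """Run states in order; retry a failing state up to max_retries times."""
    current = start
    while current is not None:
        handler, next_state = states[current]
        for attempt in range(max_retries + 1):
            try:
                data = handler(data)
                break
            except Exception:
                if attempt == max_retries:
                    raise
        current = next_state
    return data

states = {
    "Extract":   (lambda d: d + ["extracted"], "Transform"),
    "Transform": (lambda d: d + ["transformed"], "Load"),
    "Load":      (lambda d: d + ["loaded"], None),
}
result = run_state_machine(states, "Extract", [])
```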
Posted 1 month ago
3.0 - 7.0 years
2 - 6 Lacs
Hyderabad, Pune, Gurugram
Work from Office
Location: Pune, Hyderabad, Gurgaon, Bangalore [Hybrid]. Skills: Python, PySpark, SQL, AWS services (AWS Glue, S3, IAM, Athena, AWS CloudFormation, AWS CodePipeline, AWS Lambda, Transfer Family, AWS Lake Formation, and CloudWatch), and CI/CD automation of AWS CloudFormation stacks.
Posted 1 month ago
4.0 - 9.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Job Posting Title: BUSINESS INTELLIGENCE ANALYST I
Band/Level: 5-4-S
Education Experience: Bachelor's Degree (High School + 4 years)
Employment Experience: Less than 1 year

At TE, you will unleash your potential working with people from diverse backgrounds and industries to create a safer, sustainable and more connected world.

Job Overview
TE Connectivity's Business Intelligence teams are responsible for the processing, mining, and delivery of data to their customer community through repositories, tools, and services.

Roles & Responsibilities
- Assist in the development and deployment of Digital Factory solutions and Machine Learning models across Manufacturing, Quality, and Supply Chain functions.
- Support data collection, cleaning, preparation, and transformation from multiple sources, ensuring data consistency and readiness.
- Contribute to the creation of dashboards and reports using tools such as Power BI or Tableau.
- Work on basic analytics and visualization tasks to derive insights and identify improvement areas.
- Assist in maintaining existing ML models, including data monitoring and model retraining processes.
- Participate in small-scale PoCs (proofs of concept) and pilot projects with senior team members.
- Document use cases, write clean code with guidance, and contribute to knowledge-sharing sessions.
- Support integration of models into production environments and perform basic testing.

Desired Candidate
- Proficiency in Python and/or R for data analysis, along with libraries like Pandas, NumPy, Matplotlib, and Seaborn.
- Basic understanding of statistical concepts such as distributions, correlation, regression, and hypothesis testing.
- Familiarity with SQL or other database querying tools, e.g., pyodbc, sqlite3, PostgreSQL.
- Exposure to ML algorithms like linear/logistic regression, decision trees, k-NN, or SVM.
- Basic knowledge of Jupyter Notebooks and version control using Git/GitHub.
- Good communication skills in English (written and verbal); able to explain technical topics simply.
- Collaborative, eager to learn, and adaptable in a fast-paced and multicultural environment.
- Exposure to or interest in manufacturing technologies (e.g., stamping, molding, assembly).
- Exposure to cloud platforms (AWS/Azure) or services like S3, SageMaker, and Redshift is an advantage.
- Hands-on experience in image data preprocessing (resizing, Gaussian blur, PCA) or computer vision projects.
- Interest in AutoML tools and transfer learning techniques.

ABOUT TE CONNECTIVITY
TE Connectivity plc (NYSE: TEL) is a global industrial technology leader creating a safer, sustainable, productive, and connected future. Our broad range of connectivity and sensor solutions enable the distribution of power, signal, and data to advance next-generation transportation, energy networks, automated factories, data centers, medical technology and more. With more than 85,000 employees, including 9,000 engineers, working alongside customers in approximately 130 countries, TE ensures that EVERY CONNECTION COUNTS. Learn more at www.te.com and on LinkedIn, Facebook, WeChat, Instagram and X (formerly Twitter).

WHAT TE CONNECTIVITY OFFERS:
We are pleased to offer you an exciting total package that can also be flexibly adapted to changing life situations - the well-being of our employees is our top priority!
- Competitive Salary Package
- Performance-Based Bonus Plans
- Health and Wellness Incentives
- Employee Stock Purchase Program
- Community Outreach Programs / Charity Events

IMPORTANT NOTICE REGARDING RECRUITMENT FRAUD
TE Connectivity has become aware of fraudulent recruitment activities being conducted by individuals or organizations falsely claiming to represent TE Connectivity. Please be advised that TE Connectivity never requests payment or fees from job applicants at any stage of the recruitment process.
All legitimate job openings are posted exclusively on our official careers website at te.com/careers, and all email communications from our recruitment team will come only from email addresses ending in @te.com. If you receive any suspicious communications, we strongly advise you not to engage or provide any personal information, and to report the incident to your local authorities. Across our global sites and business units, we put together packages of benefits that are either supported by TE itself or provided by external service providers. In principle, the benefits offered can vary from site to site.
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
India, Bengaluru
Work from Office
Senior Data Engineer - India, Bengaluru

Get to know Okta
Okta is The World's Identity Company. We free everyone to safely use any technology anywhere, on any device or app. Our Workforce and Customer Identity Clouds enable secure yet flexible access, authentication, and automation that transforms how people move through the digital world, putting Identity at the heart of business security and growth. At Okta, we celebrate a variety of perspectives and experiences. We are not looking for someone who checks every single box - we're looking for lifelong learners and people who can make us better with their unique experiences. Join our team! We're building a world where Identity belongs to you.

Senior Data Engineer - Enterprise Data Platform

Get to know Data Engineering
Okta's Business Operations team is on a mission to accelerate Okta's scale and growth. We bring world-class business acumen and technology expertise to every interaction. We also drive cross-functional collaboration and are focused on delivering measurable business outcomes. Business Operations strives to deliver amazing technology experiences for our employees, and to ensure that our offices have all the technology that is needed for the future of work. The Data Engineering team is focused on building platforms and capabilities that are utilized across the organization by sales, marketing, engineering, finance, product, and operations. The ideal candidate will have a strong engineering background with the ability to tie engineering initiatives to business impact. You will be part of a team doing detailed technical designs, development, and implementation of applications using cutting-edge technology stacks.

The Senior Data Engineer Opportunity
A Senior Data Engineer is responsible for designing, building, and maintaining scalable solutions. This role involves collaborating with data engineers, analysts, scientists, and other engineers to ensure data availability, integrity, and security.
The ideal candidate will have a strong background in cloud platforms, data warehousing, infrastructure as code, and continuous integration/continuous deployment (CI/CD) practices.

What you'll be doing:
- Design, develop, and maintain scalable data platforms using AWS, Snowflake, dbt, and Databricks.
- Use Terraform to manage infrastructure as code, ensuring consistent and reproducible environments.
- Develop and maintain CI/CD pipelines for data platform applications using GitHub and GitLab.
- Troubleshoot and resolve issues related to data infrastructure and workflows.
- Containerize applications and services using Docker to ensure portability and scalability.
- Conduct vulnerability scans and apply necessary patches to ensure the security and integrity of the data platform.
- Work with data engineers to design and implement Secure Development Lifecycle practices and security tooling (DAST, SAST, SCA, secret scanning) in automated CI/CD pipelines.
- Ensure data security and compliance with industry standards and regulations.
- Stay updated with the latest trends and technologies in data engineering and cloud platforms.

What we are looking for:
- BS in Computer Science, Engineering, or another quantitative field of study
- 5+ years in a data engineering role
- 5+ years of experience working with SQL and ETL tools such as Airflow and dbt, with relational and columnar MPP databases like Snowflake or Redshift, and hands-on experience with AWS (e.g., S3, Lambda, EMR, EC2, EKS)
- 2+ years of experience managing CI/CD infrastructure, with strong proficiency in tools like GitHub Actions, Jenkins, ArgoCD, GitLab, or any CI/CD tool to streamline deployment pipelines and ensure efficient software delivery
- 2+ years of experience with Java, Python, Go, or similar backend languages
- Experience with Terraform for infrastructure as code
- Experience with Docker and containerization technologies
- Experience working with lakehouse architectures such as Databricks and file formats like Iceberg and Delta
- Experience in designing, building, and managing complex deployment pipelines

"This role requires in-person onboarding and travel to our Bengaluru, IN office during the first week of employment."

What you can look forward to as a Full-Time Okta employee!
- Amazing Benefits
- Making Social Impact
- Developing Talent and Fostering Connection + Community at Okta

Okta cultivates a dynamic work environment, providing the best tools, technology, and benefits to empower our employees to work productively in a setting that best and uniquely suits their needs. Each organization is unique in the degree of flexibility and mobility in which they work, so that all employees are enabled to be their most creative and successful versions of themselves, regardless of where they live. Find your place at Okta today! https://www.okta.com/company/careers/. Some roles may require travel to one of our office locations for in-person onboarding.

Okta is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, ancestry, marital status, age, physical or mental disability, or status as a protected veteran. We also consider for employment qualified applicants with arrest and conviction records, consistent with applicable laws. If reasonable accommodation is needed to complete any part of the job application, interview process, or onboarding, please use this Form to request an accommodation. Okta is committed to complying with applicable data privacy and security laws and regulations. For more information, please see our Privacy Policy at https://www.okta.com/privacy-policy/.
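The ETL orchestration tooling this role names (Airflow, dbt) boils down to DAG scheduling: tasks declare upstream dependencies and run in topological order. A stdlib miniature of that idea, with invented task names, is below; real Airflow adds scheduling, retries, and distributed execution on top.

```python
# DAG-style orchestration in miniature (Python 3.9+): tasks run only after
# their upstream dependencies complete. Task names are illustrative.
from graphlib import TopologicalSorter

# task -> set of upstream tasks it depends on
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

run_log = []

def run(task):
    """Stand-in for executing a real task (a dbt model, a Spark job, ...)."""
    run_log.append(task)

for task in TopologicalSorter(dag).static_order():
    run(task)
```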
Equal Opportunity Employment Information Read more Individuals seeking employment at this company are considered without regards to race, color, religion, national origin, age, sex, marital status, ancestry, physical or mental disability, veteran status, gender identity, or sexual orientation. When submitting your application above, you are being given the opportunity to provide information about your race/ethnicity, gender, and veteran status. Completion of the form is entirely voluntary . Whatever your decision, it will not be considered in the hiring process or thereafter. Any information that you do provide will be recorded and maintained in a confidential file. If you believe you belong to any of the categories of protected veterans listed below, please indicate by making the appropriate selection. As a government contractor subject to Vietnam Era Veterans Readjustment Assistance Act (VEVRAA), we request this information in order to measure the effectiveness of the outreach and positive recruitment efforts we undertake pursuant to VEVRAA. Classification of protected categories is as follows: A "disabled veteran" is one of the followinga veteran of the U.S. military, ground, naval or air service who is entitled to compensation (or who but for the receipt of military retired pay would be entitled to compensation) under laws administered by the Secretary of Veterans Affairs; or a person who was discharged or released from active duty because of a service-connected disability. A "recently separated veteran" means any veteran during the three-year period beginning on the date of such veteran's discharge or release from active duty in the U.S. military, ground, naval, or air service. An "active duty wartime or campaign badge veteran" means a veteran who served on active duty in the U.S. 
military, ground, naval or air service during a war, or in a campaign or expedition for which a campaign badge has been authorized under the laws administered by the Department of Defense. An "Armed forces service medal veteran" means a veteran who, while serving on active duty in the U.S. military, ground, naval or air service, participated in a United States military operation for which an Armed Forces service medal was awarded pursuant to Executive Order 12985. Pay Transparency Okta complies with all applicable federal, state, and local pay transparency rules. For additional information about the federal requirements, click here . Voluntary Self-Identification of Disability Form CC-305 Page 1 of 1 OMB Control Number 1250-0005 Expires 04/30/2026 Why are you being asked to complete this form? We are a federal contractor or subcontractor. The law requires us to provide equal employment opportunity to qualified people with disabilities. We have a goal of having at least 7% of our workers as people with disabilities. The law says we must measure our progress towards this goal. To do this, we must ask applicants and employees if they have a disability or have ever had one. People can become disabled, so we need to ask this question at least every five years. Completing this form is voluntary, and we hope that you will choose to do so. Your answer is confidential. No one who makes hiring decisions will see it. Your decision to complete the form and your answer will not harm you in any way. If you want to learn more about the law or this form, visit the U.S. Department of Labor's Office of Federal Contract Compliance Programs (OFCCP) website at www.dol.gov/agencies/ofccp . How do you know if you have a disability? A disability is a condition that substantially limits one or more of your major life activities. If you have or have ever had such a condition, you are a person with a disability. Disabilities include, but are not limited to: Alcohol or other substance use disorder (not currently using drugs illegally) Autoimmune disorder, for example, lupus, fibromyalgia, rheumatoid arthritis, HIV/AIDS Blind or low vision Cancer (past or present) Cardiovascular or heart disease Celiac disease Cerebral palsy Deaf or serious difficulty hearing Diabetes Disfigurement, for example, disfigurement caused by burns, wounds, accidents, or congenital disorders Epilepsy or other seizure disorder Gastrointestinal disorders, for example, Crohn's disease, irritable bowel syndrome Intellectual or developmental disability Mental health conditions, for example, depression, bipolar disorder, anxiety disorder, schizophrenia, PTSD Missing limbs or partially missing limbs Mobility impairment, benefiting from the use of a wheelchair, scooter, walker, leg brace(s) and/or other supports Nervous system condition, for example, migraine headaches, Parkinson's disease, multiple sclerosis (MS) Neurodivergence, for example, attention-deficit/hyperactivity disorder (ADHD), autism spectrum disorder, dyslexia, dyspraxia, other learning disabilities Partial or complete paralysis (any cause) Pulmonary or respiratory conditions, for example, tuberculosis, asthma, emphysema Short stature (dwarfism) Traumatic brain injury PUBLIC BURDEN STATEMENT: According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such collection displays a valid OMB control number. This survey should take about 5 minutes to complete.
Okta The foundation for secure connections between people and technology Okta is the leading independent provider of identity for the enterprise. The Okta Identity Cloud enables organizations to securely connect the right people to the right technologies at the right time. With over 7,000 pre-built integrations to applications and infrastructure providers, Okta customers can easily and securely use the best technologies for their business. More than 19,300 organizations, including JetBlue, Nordstrom, Slack, T-Mobile, Takeda, Teach for America, and Twilio, trust Okta to help protect the identities of their workforces and customers.
Posted 1 month ago
2.0 - 7.0 years
11 - 15 Lacs
Bengaluru
Work from Office
WFM Lead Specialist About us: As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful. Overview about TII: At Target, we have a timeless purpose and a proven strategy, and that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations. The Target Enterprise Services (TES) organization is close to the action when it comes to communication, whether with guests or Target team members. From guest service professionals and product designers to vendor managers and financial and workforce management analysts, TES comprises several key and high-visibility areas that elevate and nurture Target's distinctive reputation. We cultivate loyalty and satisfaction through exceptional service and support. And we foster a culture of responsive, knowledgeable and committed service from the inside out through enterprise services our people can count on. Beyond our world-class service centers, there are many important challenges to be met in other TES teams, like the TES Operations and Product Team, which plays at the intersections of process and technology, and Service Delivery Enablement, which develops comprehensive service delivery strategies for our service centers.
TES Product Design manages and grows loyalty, frequency, and other marketing programs for all Payment Cards (FPSC, Gift Card). Our Bank Program and its included credit risk and compliance functions manage Merchandise Finance, Capital Finance, Expense Management, and Financial Goals and Forecasts. And the TES Controller heads up TES Accounting and Financial Operations, including Accounting and Control, Retail Bankcard Services, Target Card Services, and Fraud Prevention & Dispute Resolution. As a WFM Lead Specialist, you will manage contacts and team member resources for Target service centers. Job duties may change at any time due to business needs. You will use strong critical thinking and decision-making skills to ensure that service level and productivity metrics are achieved. You will provide detailed communication, both written and verbal, throughout the day to Workforce Management business partners and Service Center leaders. Responsibilities include managing service level performance for all service centers by analyzing intra-day volume trends, agent skills, and schedule effectiveness to develop strategies that provide top-level service. You will effectively execute contingency plans in the face of unexpected workflow changes or contact arrival patterns, and share Workforce Management system data with integrity and accuracy. You will identify and routinely link with Target partners whose activities may impact volume so these can be factored into forecasts, avoiding unexpected volume spikes that result in poor guest service. You will collect metrics on service and staffing across service centers, analyze this data to determine what trends exist, and share meaningful insights with service center partners. REPORTING/WORKING RELATIONSHIPS: Reports to the Manager, WFM TII. Close partnerships with TES Service Center, HQ & TII. SHIFT REQUIREMENTS: Able to work on holidays and weekends.
45 hours/week with any two consecutive weekly offs. Rotational shifts (24/7), primarily evening and night shifts. MINIMUM REQUIREMENTS: Four-year college degree or equivalent with 2+ years of Service Center experience. Strong critical thinking and decision-making skills. Demonstrated ability to work independently, take initiative, and handle multiple tasks. Strong technical skills, ability to work within multiple systems, and proficiency in MS PowerPoint, advanced Excel, and data visualization tools. Ability to prioritize responsibilities and work under pressure and within time constraints.
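The intra-day staffing analysis this role describes is often grounded in queueing math. As a minimal sketch (assumption: the Erlang C model, which the posting does not name), the probability that a contact waits, and the resulting service level, can be computed from offered traffic and agent count:

```python
import math

def erlang_c(agents: int, traffic_erlangs: float) -> float:
    """Probability an arriving contact must wait (Erlang C formula)."""
    if traffic_erlangs >= agents:
        return 1.0  # overloaded: every contact waits
    a, n = traffic_erlangs, agents
    # Build sum_{k=0}^{n-1} a^k/k! iteratively to avoid factorial overflow.
    term, series = 1.0, 1.0  # term tracks a^k/k!, series accumulates
    for k in range(1, n):
        term *= a / k
        series += term
    top = term * (a / n) * (n / (n - a))  # a^n/n! * n/(n-a)
    return top / (series + top)

def service_level(agents, traffic_erlangs, target_sec, aht_sec):
    """Fraction of contacts answered within target_sec seconds."""
    if traffic_erlangs >= agents:
        return 0.0
    pw = erlang_c(agents, traffic_erlangs)
    return 1.0 - pw * math.exp(-(agents - traffic_erlangs) * target_sec / aht_sec)
```

For example, with 10 erlangs of offered traffic and a 20-second answer target at a 240-second handle time, moving from 12 to 14 agents raises the projected service level, which is the kind of what-if an intra-day analyst runs when re-skilling agents.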
Posted 1 month ago
3.0 - 8.0 years
5 - 10 Lacs
India, Bengaluru
Work from Office
Senior Full Stack Engineer India, Bengaluru Get to know Okta Okta is The World's Identity Company. We free everyone to safely use any technology anywhere, on any device or app. Our Workforce and Customer Identity Clouds enable secure yet flexible access, authentication, and automation that transforms how people move through the digital world, putting Identity at the heart of business security and growth. At Okta, we celebrate a variety of perspectives and experiences. We are not looking for someone who checks every single box; we're looking for lifelong learners and people who can make us better with their unique experiences. Join our team! We're building a world where Identity belongs to you. The Data Engineering Team Our focus is on building platforms and capabilities that are utilized across the organization by sales, marketing, engineering, finance, product, and operations. The ideal candidate will have a strong engineering background with the ability to tie engineering initiatives to business impact. You will be part of a team doing detailed technical designs, development, and implementation of applications using cutting-edge technology stacks.
The Full Stack Engineer Opportunity This role is responsible for designing and developing scalable solutions for our Business Intelligence stacks in a fast-paced, Agile environment. Experience with frontend development and UX design. Experience with containerization and orchestration (Docker, Kubernetes). Knowledge of DevOps practices and tools. Previous experience in Agile/Scrum development methodologies. What you'll be doing Building and maintaining user-facing applications, ensuring they are performant, responsive, and user-friendly. Collaborate with UX/UI designers to implement visually appealing and intuitive designs, and with backend developers to showcase data-intensive webpages. Conduct thorough testing and debugging to ensure the quality and performance of the software. Implement seamless integration of microservices with the front end. Implement security best practices to protect sensitive data and ensure system integrity. Deploy applications to production environments and manage the deployment process. Monitor system performance and address any issues promptly to ensure a smooth user experience. Collaborate with cross-functional teams, including product managers, designers, and other developers, to deliver end-to-end solutions. What you'll bring to the role BS in Computer Science, Engineering, or another quantitative field of study. 3+ years of experience with frontend development.
Strong proficiency in developing UI components using the client-side framework ReactJS. Extensive experience in web fundamentals like HTML5 and CSS3. Solid understanding of the full web technology stack: RESTful services, client-side frameworks, data persistence technologies, and security. Expert in relational database design, implementation, queries, and reporting (DDL, SQL). Experience working with SQL, ETL tools such as Airflow, and relational and columnar MPP databases like Snowflake, Athena, or Redshift. Excellent oral and written communication skills, for both technical and non-technical audiences. And extra credit if you have experience in any of the following! Strong knowledge of cloud computing platforms like AWS, including serverless services and infrastructure as code using Terraform. Experience with cloud infrastructure/platforms (AWS, Azure, Google Cloud Platform) and Data Lake development. Experience with packaging and distributing containerized applications using Docker and Kubernetes. Experience developing microservices.
Posted 1 month ago
5.0 - 8.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Job Information Job Opening ID ZR_1634_JOB Date Opened 12/12/2022 Industry Technology Job Type Work Experience 5-8 years Job Title AWS-BIGDATA-DEVELOPER City Bangalore Province Karnataka Country India Postal Code 560001 Number of Positions 4 Roles and Responsibilities: Experience in AWS Glue. Experience with one or more of the following: Spark, Scala, Python, and/or R. Experience in API development with NodeJS. Experience with AWS (S3, EC2) or another cloud provider. Experience in data virtualization tools like Dremio and Athena is a plus. Should be technically proficient in Big Data concepts. Should be technically proficient in Hadoop and NoSQL (MongoDB). Good communication and documentation skills.
Posted 1 month ago
8.0 - 12.0 years
4 - 8 Lacs
Pune
Work from Office
Job Information Job Opening ID ZR_1581_JOB Date Opened 25/11/2022 Industry Technology Job Type Work Experience 8-12 years Job Title Senior Specialist - Data Engineer City Pune Province Maharashtra Country India Postal Code 411001 Number of Positions 4 Location: Pune/Mumbai/Bangalore/Chennai Roles & Responsibilities: Total 8-10 years of working experience. Experience/Needs: 8-10 years of experience with big data tools like Spark, Kafka, Hadoop, etc. Design and deliver consumer-centric, high-performance systems. You would be dealing with huge volumes of data sets arriving through batch and streaming platforms. You will be responsible for building and delivering data pipelines that process, transform, integrate, and enrich data to meet various demands from the business. Mentor the team on infrastructure, networking, data migration, monitoring, and troubleshooting aspects. Focus on automation using Infrastructure as Code (IaC), Jenkins, DevOps, etc. Design, build, test, and deploy streaming pipelines for data processing in real time and at scale. Experience with stream-processing systems like Storm, Spark Streaming, Flink, etc. Experience with object-oriented/object-functional scripting languages: Scala, Java, etc. Develop software systems using test-driven development employing CI/CD practices. Partner with other engineers and team members to develop software that meets business needs. Follow Agile methodology for software development and technical documentation. Good to have banking/finance domain knowledge. Strong written and oral communication, presentation, and interpersonal skills.
Exceptional analytical, conceptual, and problem-solving abilities. Able to prioritize and execute tasks in a high-pressure environment. Experience working in a team-oriented, collaborative environment. 8-10 years of hands-on coding experience. Proficient in Java, with a good knowledge of its ecosystems. Experience writing Spark code in Scala. Experience with big data tools like Sqoop, Hive, Pig, Hue. Solid understanding of object-oriented programming and HDFS concepts. Familiar with various design and architectural patterns. Experience with big data tools: Hadoop, Spark, Kafka, Flink, Hive, Sqoop, etc. Experience with relational SQL and NoSQL databases like MySQL, PostgreSQL, MongoDB, and Cassandra. Experience with data pipeline tools like Airflow, etc. Experience with cloud services: AWS EC2, S3, EMR, RDS, Redshift; Google BigQuery. Experience with stream-processing systems: Storm, Spark Streaming, Flink, etc. Experience with object-oriented/object-functional scripting languages: Python, Java, Scala, etc. Expertise in designing/developing platform components like caching, messaging, event processing, automation, transformation, and tooling frameworks.
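The process-transform-enrich pipeline stage this posting describes can be sketched independently of Spark. A minimal pure-Python version, where the field names and the branch-to-region lookup are illustrative rather than taken from the posting:

```python
from datetime import datetime, timezone

# Illustrative reference data an enrichment step might join against.
BRANCH_REGION = {"BLR01": "south", "PNQ02": "west"}

def enrich(events):
    """Filter malformed events, normalize amounts, and join region info."""
    out = []
    for e in events:
        if "branch" not in e or "amount" not in e:
            continue  # in a real pipeline, route to a dead-letter sink instead
        out.append({
            **e,
            "amount_inr": round(float(e["amount"]), 2),
            "region": BRANCH_REGION.get(e["branch"], "unknown"),
            "processed_at": datetime.now(timezone.utc).isoformat(),
        })
    return out
```

The same filter/normalize/join shape maps directly onto Spark's `filter`, `withColumn`, and broadcast-join operations when the data volumes the posting mentions make a distributed engine necessary.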
Posted 1 month ago
12.0 - 15.0 years
13 - 17 Lacs
Mumbai
Work from Office
Job Information Job Opening ID ZR_1688_JOB Date Opened 24/12/2022 Industry Technology Job Type Work Experience 12-15 years Job Title Big Data Architect City Mumbai Province Maharashtra Country India Postal Code 400008 Number of Positions 4 Location: Mumbai, Pune, Chennai, Hyderabad, Coimbatore, Kolkata 12+ years of experience in the Big Data space across architecture, design, development, testing, and deployment, with a full understanding of the SDLC. 1. Experience with Hadoop and its related technology stack. 2. Experience with the Hadoop ecosystem (HDP + CDP) / Big Data (especially Hive); hands-on experience with programming languages such as Java/Scala/Python; hands-on experience/knowledge of Spark. 3. Being responsible for, and focusing on, uptime and reliable running of all ingestion/ETL jobs. 4. Good SQL, and comfort working in a Unix/Linux environment, is a must. 5. Create and maintain optimal data pipeline architecture. 6. Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement. 7. Good to have cloud experience. 8. Good to have experience with Hadoop integration with data visualization tools like Power BI.
Posted 1 month ago
6.0 - 10.0 years
3 - 7 Lacs
Chennai
Work from Office
Job Information Job Opening ID ZR_2199_JOB Date Opened 15/04/2024 Industry Technology Job Type Work Experience 6-10 years Job Title Sr Data Engineer City Chennai Province Tamil Nadu Country India Postal Code 600004 Number of Positions 4 Strong experience in Python. Good experience in Databricks. Experience working on the AWS/Azure cloud platforms. Experience working with REST APIs and services, and with messaging and event technologies. Experience with ETL or data pipeline building tools. Experience with streaming platforms such as Kafka. Demonstrated experience working with large and complex data sets. Ability to document data pipeline architecture and design. Experience in Airflow is nice to have. Ability to build complex Delta Lake solutions.
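The REST-API ingestion work mentioned above typically loops over pages until the service signals the end. A minimal sketch with a stubbed fetcher standing in for the real HTTP call (the 250-row source and the `id` field are hypothetical):

```python
def ingest_all(fetch_page, page_size=100):
    """Pull every record from a paginated source via the supplied fetcher.

    fetch_page(offset, limit) must return a list of records; a page
    shorter than page_size (or empty) signals the final one.
    """
    records, offset = [], 0
    while True:
        page = fetch_page(offset, page_size)
        records.extend(page)
        if len(page) < page_size:
            return records
        offset += page_size

# Stub simulating a 250-row upstream API, for demonstration only.
def fake_fetch(offset, limit):
    data = [{"id": i} for i in range(250)]
    return data[offset:offset + limit]
```

In a real pipeline the fetcher would wrap an HTTP client with retries and auth, and the short-page termination rule would be replaced by whatever cursor or `next` token the actual API provides.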
Posted 1 month ago
6.0 - 10.0 years
1 - 4 Lacs
Pune
Work from Office
Job Information Job Opening ID ZR_1594_JOB Date Opened 29/11/2022 Industry Technology Job Type Work Experience 6-10 years Job Title AWS Glue Engineer City Pune Province Maharashtra Country India Postal Code 411001 Number of Positions 4 Roles & Responsibilities: Provides expert-level development, system analysis, design, and implementation of applications using AWS services, specifically using Python for Lambda. Translates technical specifications and/or design models into code for new or enhancement projects (for internal or external clients). Develops code that reuses objects, is well-structured, includes sufficient comments, and is easy to maintain. Provides follow-up production support when needed. Submits change control requests and documents. Participates in design, code, and test inspections throughout the life cycle to identify issues and ensure methodology compliance. Participates in systems analysis activities, including system requirements analysis and definition, e.g. prototyping. Participates in other meetings, such as those for use case creation and analysis. Performs unit testing and writes appropriate unit test plans to ensure requirements are satisfied. Assists in integration, systems acceptance, and other related testing as needed. Ensures developed code is optimized to meet client performance specifications associated with page rendering time by completing page performance tests. Technical Skills Required: Experience in building large-scale batch and data pipelines with data processing frameworks on the AWS cloud platform using PySpark (on EMR) and Glue ETL. Deep experience in developing data processing and data manipulation tasks using PySpark, such as reading data from external sources, merging data, performing data enrichment, and loading into target data destinations. Experience in deploying and operationalizing code using the CI/CD tools Bitbucket and Bamboo. Strong AWS cloud computing experience.
Extensive experience in Lambda, S3, EMR, and Redshift. Should have worked on data warehouse/database technologies for at least 8 years. Any AWS certification will be an added advantage.
Posted 1 month ago
5.0 - 8.0 years
2 - 5 Lacs
Chennai
Work from Office
Job Information Job Opening ID ZR_2168_JOB Date Opened 10/04/2024 Industry Technology Job Type Work Experience 5-8 years Job Title AWS Data Engineer City Chennai Province Tamil Nadu Country India Postal Code 600002 Number of Positions 4 Mandatory Skills: AWS, Python, SQL, Spark, Airflow, Snowflake. Responsibilities: Create and manage cloud resources in AWS. Ingest data from different data sources which expose data using different technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time series data from various proprietary systems. Implement data ingestion and processing with the help of Big Data technologies. Process and transform data using technologies such as Spark and cloud services. You will need to understand your part of the business logic and implement it using the language supported by the base data platform. Develop automated data quality checks to make sure the right data enters the platform and to verify the results of calculations. Develop an infrastructure to collect, transform, combine, and publish/distribute customer data. Define process improvement opportunities to optimize data collection, insights, and displays. Ensure data and results are accessible, scalable, efficient, accurate, complete, and flexible. Identify and interpret trends and patterns from complex data sets. Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders. Key participant in regular Scrum ceremonies with the agile teams. Proficient at developing queries, writing reports, and presenting findings. Mentor junior members and bring best industry practices.
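The automated data-quality checks this posting calls for usually boil down to rule functions applied per batch before load. A minimal sketch (the rule set and column names are illustrative, not from the posting):

```python
def run_quality_checks(rows, required, non_negative=()):
    """Return human-readable violations for a batch of dict records.

    required     - columns that must be present and non-empty
    non_negative - numeric columns that must not be negative
    """
    violations = []
    for i, row in enumerate(rows):
        for col in required:
            if row.get(col) in (None, ""):
                violations.append(f"row {i}: missing {col}")
        for col in non_negative:
            value = row.get(col)
            if isinstance(value, (int, float)) and value < 0:
                violations.append(f"row {i}: negative {col}")
    return violations
```

A pipeline would typically fail the load (or quarantine the batch) when the violation list is non-empty, and emit the list to monitoring so trends in bad data are visible.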
Posted 1 month ago
5.0 - 8.0 years
2 - 6 Lacs
Chennai
Work from Office
Job Information: Job Opening ID ZR_1668_JOB | Date Opened 19/12/2022 | Industry Technology | Work Experience 5-8 years | Job Title Sr. AWS Developer | City Chennai | Province Tamil Nadu | Country India | Postal Code 600001 | Number of Positions 4

Tech stack: AWS Lambda, Glue, Kafka/Kinesis, RDBMS (Oracle, MySQL, Redshift, PostgreSQL, Snowflake), Gateway, CloudFormation/Terraform, Step Functions, CloudWatch, Python, PySpark

Job role & responsibilities: Looking for a Software Engineer/Senior Software Engineer with hands-on experience in ETL projects and extensive knowledge of building data-processing systems with Python, PySpark, and cloud technologies (AWS). Experience developing in the AWS cloud (S3, Redshift, Aurora, Glue, Lambda, Hive, Kinesis, Spark, Hadoop/EMR).

Required skills: Amazon Kinesis, Amazon Aurora, data warehousing, SQL, AWS Lambda, Spark, AWS QuickSight; advanced Python skills; data engineering, ETL, and ELT skills; experience with cloud platforms (AWS, GCP, or Azure).

Mandatory skills: Data warehouse, ETL, SQL, Python, AWS Lambda, Glue, AWS Redshift.
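This role centers on ETL in Python: extracting raw events, transforming them into warehouse-ready rows, and staging them for a bulk load into Redshift. A small stdlib-only sketch of that transform-and-stage step follows; the event schema, field names, and newline-delimited-JSON staging format are all assumptions for illustration.

```python
# Illustrative ETL transform step: flatten raw JSON events into rows shaped
# for a warehouse table, then serialize them for a bulk load (e.g. Redshift COPY).
import json

def transform_event(raw: str) -> dict:
    """Flatten one raw JSON event into a warehouse row (assumed schema)."""
    event = json.loads(raw)
    return {
        "event_id": event["id"],
        "user_id": event.get("user", {}).get("id"),
        "event_type": event.get("type", "unknown"),
        "amount_cents": round(float(event.get("amount", 0)) * 100),
    }

def to_staging_lines(raw_events: list[str]) -> str:
    """Serialize transformed rows as newline-delimited JSON, one row per line."""
    rows = [transform_event(r) for r in raw_events]
    return "\n".join(json.dumps(row, sort_keys=True) for row in rows)
```

Storing money as integer cents rather than floats is a common warehouse convention that avoids rounding drift across aggregations.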
Posted 1 month ago
3.0 - 8.0 years
15 - 22 Lacs
Gurugram
Work from Office
Data Engineer
Experience: 4 years, with a minimum of 3+ years of relevant experience in data engineering
Location: Gurgaon
Role Summary: The Data Engineer will develop and maintain AWS-based data pipelines, ensuring optimal ingestion, transformation, and storage of clinical trial data. The role requires expertise in ETL, AWS Glue, Lambda functions, and Redshift optimization.
Must have:
- AWS (Glue, Lambda, Redshift, Step Functions)
- Python, SQL, API-based ingestion
- PySpark
- Redshift, SQL/PostgreSQL, Snowflake (optional)
- Redshift query optimization, indexing
- IAM, encryption, row-level security
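One of the must-haves above is API-based ingestion, which in practice usually means pulling pages of records newer than a stored watermark and advancing that watermark only after a successful batch. A hedged, stdlib-only sketch of that loop is below; the `fetch_page` callable, `updated_at` field, and page size are hypothetical.

```python
# Sketch of watermark-based incremental ingestion from a paginated API.
# `fetch_page(watermark, page)` is a hypothetical callable returning one page
# of records updated after `watermark` (empty list when exhausted).
from typing import Callable

def ingest_since(fetch_page: Callable[[str, int], list[dict]],
                 watermark: str, page_size: int = 100) -> tuple[list[dict], str]:
    """Collect all records newer than `watermark`; return them and the new watermark."""
    collected: list[dict] = []
    page = 0
    while True:
        batch = fetch_page(watermark, page)
        if not batch:
            break
        collected.extend(batch)
        page += 1
        if len(batch) < page_size:  # short page signals the last one
            break
    # Advance the watermark only from records actually ingested.
    new_watermark = max((r["updated_at"] for r in collected), default=watermark)
    return collected, new_watermark
```

Persisting `new_watermark` (e.g. in DynamoDB or an SSM parameter) only after the batch lands makes the ingestion safely re-runnable.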
Posted 1 month ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate,

We are hiring a Cloud Database Administrator to manage and optimize cloud-based databases for performance, availability, and security. Ideal for DBAs familiar with cloud-native tools and automation.

Key Responsibilities:
- Provision and manage cloud-hosted databases (RDS, Cloud SQL, Cosmos DB)
- Monitor performance and tune queries and indexes
- Automate backups, scaling, and high-availability configurations
- Ensure database security, auditing, and compliance

Required Skills & Qualifications:
- Experience with cloud platforms (AWS, Azure, GCP) and DBaaS tools
- Proficiency in SQL and database optimization techniques
- Familiarity with failover strategies, monitoring, and replication
- Bonus: knowledge of serverless databases and multi-region architectures

Note: If interested, please share your updated resume and a preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager, Integra Technologies
Posted 1 month ago
4.0 - 9.0 years
12 - 22 Lacs
Hyderabad, Chennai
Work from Office
Interested candidates can also apply with Sanjeevan Natarajan (sanjeevan.natarajan@careernet.in)

Role & responsibilities:
- Technical Leadership: Lead a team of data engineers and developers; define technical strategy, best practices, and architecture for data platforms.
- End-to-End Solution Ownership: Architect, develop, and manage scalable, secure, and high-performing data solutions on AWS and Databricks.
- Data Pipeline Strategy: Oversee the design and development of robust data pipelines for ingestion, transformation, and storage of large-scale datasets.
- Data Governance & Quality: Enforce data validation, lineage, and quality checks across the data lifecycle; define standards for metadata, cataloging, and governance.
- Orchestration & Automation: Design automated workflows using Airflow, Databricks Jobs/APIs, and other orchestration tools for end-to-end data operations.
- Cloud Cost & Performance Optimization: Implement performance-tuning strategies, cost-optimization best practices, and efficient cluster configurations on AWS/Databricks.
- Security & Compliance: Define and enforce data security standards, IAM policies, and compliance with industry-specific regulatory frameworks.
- Collaboration & Stakeholder Engagement: Work closely with business users, analysts, and data scientists to translate requirements into scalable technical solutions.
- Migration Leadership: Drive strategic data migrations from on-prem/legacy systems to cloud-native platforms with minimal risk and downtime.
- Mentorship & Growth: Mentor junior engineers, contribute to talent development, and ensure continuous learning within the team.

Preferred candidate profile:
- Python, SQL, PySpark, Databricks, AWS (mandatory)
- Leadership experience in data engineering/architecture
- Added advantage: experience in Life Sciences / Pharma
Posted 1 month ago
12.0 - 17.0 years
10 - 14 Lacs
Gurugram
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: AWS Glue
Good-to-have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring seamless communication among team members.

Roles & Responsibilities:
- Act as a subject matter expert (SME)
- Collaborate with and manage the team to perform
- Be responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Lead the effort to design, build, and configure applications
- Act as the primary point of contact
- Ensure seamless communication among team members
- Provide guidance and support to team members

Professional & Technical Skills:
- Must-have: proficiency in AWS Glue
- Strong understanding of cloud computing concepts
- Experience in designing and implementing scalable applications
- Knowledge of data integration and ETL processes
- Hands-on experience with AWS services such as S3, Lambda, and Redshift

Additional Information:
- The candidate should have a minimum of 12 years of experience in AWS Glue
- This position is based at our Gurugram office
- A 15 years full-time education is required
Posted 1 month ago
15.0 - 25.0 years
17 - 27 Lacs
Mumbai
Work from Office
Project Role: Integration Engineer
Project Role Description: Provide consultative business and system integration services to help clients implement effective solutions. Understand and translate customer needs into business and technology solutions. Drive discussions and consult on transformation, the customer journey, and functional/application designs, and ensure technology and business solutions represent business requirements.
Must-have skills: SnapLogic
Good-to-have skills: NA
Minimum 15 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Integration Engineer, you will provide consultative business and system integration services to help clients implement effective solutions. You will understand and translate customer needs into business and technology solutions, drive discussions and consult on transformation, the customer journey, and functional/application designs, and ensure technology and business solutions represent business requirements. Your typical day will involve collaborating with clients, analyzing their needs, designing integration solutions, and ensuring successful implementation.

Roles & Responsibilities:
- Act as an SME with deep knowledge and experience
- Demonstrate influencing and advisory skills
- Be responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems that apply across multiple teams
- Collaborate with clients to understand their integration needs
- Design and develop integration solutions using SnapLogic
- Ensure successful implementation of integration solutions
- Provide guidance and support to junior integration engineers

Professional & Technical Skills:
- Must-have: proficiency in SnapLogic
- Good-to-have: experience with other integration platforms
- Strong understanding of integration concepts and best practices
- Experience in designing and implementing complex integration solutions
- Knowledge of API management and web services
- Familiarity with data transformation and mapping techniques

Additional Information:
- The candidate should have a minimum of 15 years of experience in SnapLogic
- This position is based in Mumbai
- A 15 years full-time education is required
Posted 1 month ago
12.0 - 17.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Talend ETL
Good-to-have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a Data Engineer Lead, you will design, develop, and maintain data solutions for data generation, collection, and processing; create data pipelines; ensure data quality; and implement ETL processes to migrate and deploy data across systems. A typical day involves working on data solutions and ETL processes.

Roles & Responsibilities:
- Act as a subject matter expert (SME)
- Collaborate with and manage the team to perform
- Be responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems that apply across multiple teams
- Lead data architecture design
- Implement data integration solutions
- Optimize ETL processes

Professional & Technical Skills:
- Must-have: proficiency in Talend ETL
- Strong understanding of data modeling
- Experience with SQL and database management
- Knowledge of cloud platforms such as AWS or Azure
- Hands-on experience with data warehousing
- Good-to-have: experience with data visualization tools

Additional Information:
- The candidate should have a minimum of 12 years of experience in Talend ETL
- This position is based at our Hyderabad office
- A 15 years full-time education is required
Posted 1 month ago