
390 Glue Jobs - Page 15

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

6 - 9 years

15 - 18 Lacs

Chennai, Tamil Nadu

Work from Office


Job Title: BI Data Analyst/Specialist
Location: Bangalore, Karnataka
Experience: 6 to 9 years
Salary: 17.5 LPA
Employment Type: Full-time
Primary Skills: Snowflake, data analysis, building ETL pipelines, Control-M, Jenkins (CI/CD pipelines), GitHub, AWS (S3, Glue, Lambda)
Secondary Skills: Tableau, PostgreSQL

Job Summary: As a BI Data Analyst/Specialist, you will be responsible for analyzing data, building ETL pipelines, and ensuring the seamless integration and movement of data across platforms. You will work with stakeholders to gather data requirements and produce actionable insights, while ensuring data quality, scalability, and efficiency. You will also play a pivotal role in managing data infrastructure, automating workflows, and working with AWS services. We are looking for a talented BI Data Analyst/Specialist to join our team and help drive data initiatives using Snowflake, AWS, and other cutting-edge technologies.

Key Responsibilities:
- Analyze data and generate actionable insights to support business decision-making.
- Design, build, and optimize ETL pipelines to move data from various sources into Snowflake and other databases.
- Collaborate with data engineers and business intelligence teams to understand data requirements and build solutions.
- Automate data workflows using tools like AWS Glue, Lambda, and Control-M for scheduling and orchestration.
- Leverage Jenkins to build and maintain CI/CD pipelines for data-related processes.
- Manage version control using GitHub and ensure smooth deployment of data-related solutions.
- Work with Snowflake to create scalable, efficient data models and support querying and reporting activities.
- Monitor data pipelines to ensure they run smoothly and troubleshoot issues when they arise.
- Use AWS S3 for data storage and ensure proper data governance practices are followed.
- Assist in creating dashboards and reports for internal teams and stakeholders using Tableau.
- Work with PostgreSQL and other relational databases to ensure data integrity and seamless integrations.
- Ensure compliance with data security and privacy standards, especially in cloud environments.

Skills and Qualifications:
- 6 to 9 years of experience in a BI Data Analyst/Specialist or similar role.
- Strong hands-on experience with Snowflake, AWS S3, AWS Glue, Lambda, and other cloud technologies.
- Solid experience building and maintaining ETL pipelines.
- Proficiency in scheduling and orchestrating workflows using Control-M.
- Experience with Jenkins for CI/CD pipeline management.
- Strong knowledge of data analysis techniques and reporting tools, especially Tableau for data visualization.
- Experience with PostgreSQL and other relational databases for data manipulation and querying.
- Expertise in SQL for querying and data manipulation.
- Ability to work with large datasets and ensure high-quality, accurate data.
- Good understanding of data warehousing concepts and business intelligence methodologies.
- Excellent problem-solving and communication skills; able to work with cross-functional teams.

Nice to Have:
- Familiarity with real-time data processing frameworks such as Apache Kafka.
- Understanding of data governance and data privacy standards (GDPR, etc.).
- Knowledge of data modeling and database optimization techniques.
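
A note for candidates preparing for this kind of role: the sketch below shows one common pattern the posting implies, staging curated S3 data into Snowflake with COPY INTO via the snowflake-connector-python package. It is illustrative only; the account, warehouse, stage, and table names are hypothetical placeholders, and real credentials should come from a secrets manager rather than being hard-coded.

# Hedged Snowflake-loading sketch; all identifiers are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="etl_user",
    password="***",  # in practice, fetch from a secrets manager
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()
try:
    # Load curated Parquet files from an external stage over S3.
    cur.execute("""
        COPY INTO sales
        FROM @curated_stage/sales/
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
finally:
    cur.close()
    conn.close()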

Posted 3 months ago


15 - 18 years

15 - 19 Lacs

Bengaluru

Work from Office


Project Role: Technology Architect
Project Role Description: Review and integrate all application requirements, including functional, security, integration, performance, quality, and operations requirements. Review and integrate the technical architecture requirements. Provide input into final decisions regarding hardware, network products, system software, and security.
Must-have skills: Tech Solution Architecture
Good-to-have skills: .NET Architecture, AWS Architecture, Google Cloud Platform Architecture
Minimum 15 years of experience is required.
Educational Qualification: AWS, GCP, or any cloud certification

Key Responsibilities:
- Design architecture for various components and validate tools.
- Participate in all architectural meetings and analyze all technical requirements for the application.
- Propose solutions with cloud services to address business enhancements.
- Review cloud services for any issues and recommend solutions.
- Work with the functional team to understand the requirements and accommodate them in the solution, high-level, and low-level design.
- Work with the development team to explain the design and ensure the implementation is aligned with it.
- Ensure security, performance, and availability in the solution design.
- Ensure Accenture guidelines are followed in all deliverables.

Technical Experience:
1. Knowledge of working with AWS services like Lambda, Glue, API Gateway, DynamoDB, Redis, IAM, S3, CloudFront, CloudWatch, CloudTrail, etc.
2. Knowledge of working with GCP services like Pub/Sub, Cloud

Professional Attributes:
1. Strong oral, written, and presentation skills with all levels of stakeholders.
2. Research, assess, and adopt new technologies as required.
3. Strive for quality of performance, usability, reliability, maintainability, and extensibility.

Qualification: AWS, GCP, or any cloud certification

Posted 3 months ago


3 - 8 years

5 - 9 Lacs

Hyderabad

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Python (Programming Language)
Good-to-have skills: Oracle Procedural Language Extensions to SQL (PL/SQL)
Minimum 3 years of experience is required.
Educational Qualification: Any graduation

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Python. Your typical day will involve working with Oracle Procedural Language Extensions to SQL (PL/SQL) and collaborating with cross-functional teams to deliver impactful data-driven solutions.

Key Responsibilities:
- Design and implement end-to-end solutions with Python in an AWS environment.
- Develop Node.js and Python code in an AWS environment.
- Create an inspiring team environment with an open communication culture. (For Leads)
- Monitor team performance and report on metrics. (For Leads)
- Discover training needs and provide coaching. (For Leads)
- Architect pilots and proof-of-concept efforts to spur innovation.
- Work in all stages of the development lifecycle.
- Automate manual data object creation and test cases. (For Leads)
- Ask smart questions, collaborate, team up, take risks, and champion new ideas. (For Leads)

Job Description/Skills:
- Strong hands-on experience in solutioning AWS components.
- Experience in CI/CD and DevOps pipelines.
- Experience with DevOps tools and development models.
- Experience working with AWS components (Step Functions, Lambda, EventBridge, S3, SNS, SQS, RDS, Glue, IAM, VPC, Secrets Manager, CloudWatch, DynamoDB).
- Experience with different version-control workflows using Git/Bitbucket.
- Experience in code coverage and fixing Sonar issues.
- Good knowledge of CI/CD pipelines using AWS/OpenShift.
- Good to have: front-end UI experience with React.js/Angular.
- Minimum 1 year of experience in Agile methodology.
- Deep understanding of database concepts and design for SQL (primarily) and NoSQL (secondarily): schema design, optimization, scalability, etc.
- Good experience with Git version control and a good understanding of code branching strategies and organization for code reuse.

Additional Information: The candidate should have a minimum of 3 years of experience in Python and PL/SQL. The ideal candidate will possess a strong educational background in computer science or a related field. This position is based at our Hyderabad office. The resource should be willing to work in B shift.

Qualifications: Any graduation
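
As a rough illustration of the Python-on-AWS work this listing describes, here is a minimal Lambda handler that reacts to S3 object-created events and records object metadata in DynamoDB with boto3. The table name is hypothetical; this is a sketch of the pattern, not the employer's actual code.

# Hedged sketch: S3-triggered Lambda that audits new objects in DynamoDB.
import json
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("example-object-audit")  # hypothetical table name

def handler(event, context):
    # S3 put events arrive as a list of records; log each new object.
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        size = record["s3"]["object"]["size"]
        table.put_item(Item={"pk": f"{bucket}/{key}", "size_bytes": size})
    return {"statusCode": 200, "body": json.dumps("ok")}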

Posted 3 months ago


3 - 7 years

5 - 9 Lacs

Hyderabad

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Python (Programming Language)
Good-to-have skills: Oracle Procedural Language Extensions to SQL (PL/SQL)
Minimum 3 years of experience is required.
Educational Qualification: Any graduation

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Python. Your typical day will involve working with Oracle Procedural Language Extensions to SQL (PL/SQL) and collaborating with cross-functional teams to deliver impactful data-driven solutions.

Key Responsibilities:
- Design and implement end-to-end solutions with Python in an AWS environment.
- Develop Node.js and Python code in an AWS environment.
- Create an inspiring team environment with an open communication culture. (For Leads)
- Monitor team performance and report on metrics. (For Leads)
- Discover training needs and provide coaching. (For Leads)
- Architect pilots and proof-of-concept efforts to spur innovation.
- Work in all stages of the development lifecycle.
- Automate manual data object creation and test cases. (For Leads)
- Ask smart questions, collaborate, team up, take risks, and champion new ideas. (For Leads)

Job Description/Skills:
- Strong hands-on experience in solutioning AWS components.
- Experience in CI/CD and DevOps pipelines.
- Experience with DevOps tools and development models.
- Experience working with AWS components (Step Functions, Lambda, EventBridge, S3, SNS, SQS, RDS, Glue, IAM, VPC, Secrets Manager, CloudWatch, DynamoDB).
- Experience with different version-control workflows using Git/Bitbucket.
- Experience in code coverage and fixing Sonar issues.
- Good knowledge of CI/CD pipelines using AWS/OpenShift.
- Good to have: front-end UI experience with React.js/Angular.
- Minimum 1 year of experience in Agile methodology.
- Deep understanding of database concepts and design for SQL (primarily) and NoSQL (secondarily): schema design, optimization, scalability, etc.
- Good experience with Git version control and a good understanding of code branching strategies and organization for code reuse.

Additional Information: The candidate should have a minimum of 3 years of experience in Python. The ideal candidate will possess a strong educational background in computer science or a related field. This position is based at our Hyderabad office. The resource should be willing to work in B shift.

Qualifications: Any graduation

Posted 3 months ago


3 - 8 years

5 - 10 Lacs

Pune

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: Python (Programming Language)
Minimum 3 years of experience is required.
Educational Qualification: Any graduation

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using the Databricks Unified Data Analytics Platform. Your typical day will involve working with Python and utilizing your expertise in Databricks to deliver impactful data-driven solutions. The ideal candidate will work in a team environment that demands technical excellence, whose members are expected to hold each other accountable for the overall success of the end product. The team's focus is on delivering innovative solutions to complex problems, while also driving simplicity in refining and supporting the solution by others.

Job Description & Responsibilities:
- Be accountable for delivery of business functionality.
- Work on the AWS cloud to migrate/re-engineer data and applications from on-premise to cloud.
- Engineer solutions conformant to enterprise standards, architecture, and technologies.
- Provide technical expertise through a hands-on approach, developing solutions that automate testing between systems.
- Perform peer code reviews, merge requests, and production releases.
- Implement design/functionality using Agile principles.
- Proven track record of quality software development and an ability to innovate outside of traditional architecture/software patterns when needed.
- A desire to collaborate in a high-performing team environment, and an ability to influence and be influenced by others.
- Have a quality mindset: not just code quality, but ensuring ongoing data quality by monitoring data to identify problems before they have business impact.
- Be entrepreneurial and business-minded; ask smart questions, take risks, and champion new ideas.
- Take ownership and accountability.

Experience Required: 3 to 5 years of experience in application program development.

Experience Desired:
- Knowledge and/or experience with healthcare information domains.
- Documented experience in a business intelligence or analytic development role on a variety of large-scale projects.
- Documented experience working with databases larger than 5 TB and excellent data analysis skills.
- Experience with TDD/BDD.
- Experience working with Spark and real-time analytic frameworks.

Education and Training Required: Bachelor's degree in Engineering or Computer Science.

Primary Skills: Python, Databricks, Teradata, SQL, Unix, ETL, data structures, Looker, Tableau, Git, Jenkins, RESTful & GraphQL APIs; AWS services such as Glue, EMR, Lambda, Step Functions, CloudTrail, CloudWatch, SNS, SQS, S3, VPC, EC2, RDS, IAM.

Additional Skills:
- Ability to rapidly prototype and storyboard/wireframe development as part of application design.
- Write referenceable and modular code.
- Willingness to continuously learn and share learnings with others.
- Ability to communicate design processes, ideas, and solutions clearly and effectively to teams and clients.
- Ability to manipulate and transform large datasets efficiently.
- Excellent troubleshooting skills to root-cause complex issues.

Qualifications: Any graduation
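
A brief PySpark sketch in the spirit of this role's quality mindset: read a raw table, fail fast on a simple data-quality violation, then write a curated Delta table. Paths and column names are hypothetical, and a Databricks (or any Spark 3.x with Delta Lake) runtime is assumed.

# Hedged Databricks/PySpark sketch; paths and columns are invented.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims-curation").getOrCreate()

claims = spark.read.format("delta").load("/mnt/raw/claims")  # hypothetical path

# Data-quality gate: stop the load before bad rows reach reports.
bad_rows = claims.filter(F.col("member_id").isNull()).count()
if bad_rows > 0:
    raise ValueError(f"{bad_rows} claims rows missing member_id; aborting load")

# Stamp the load date and publish the curated table.
(claims
 .withColumn("load_date", F.current_date())
 .write.format("delta").mode("overwrite")
 .save("/mnt/curated/claims"))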

Posted 3 months ago


5 - 10 years

7 - 12 Lacs

Chennai

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Spring Boot
Good-to-have skills: React.js
Minimum 5 years of experience is required.
Educational Qualification: Any graduation

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Angular. Your typical day will involve collaborating with cross-functional teams, developing and testing code, and ensuring the application meets the required specifications.

Job Description/Skills:
- Strong hands-on experience in solutioning AWS components.
- Experience in CI/CD and DevOps pipelines.
- Experience with DevOps tools and development models.
- Develop Node.js and Python code in an AWS environment.
- Experience working with AWS components (Step Functions, Lambda, EventBridge, S3, SNS, SQS, RDS, Glue, IAM, VPC, Secrets Manager, CloudWatch, DynamoDB).
- Experience with different version-control workflows using Git/Bitbucket.
- Experience in code coverage and fixing Sonar issues.
- Good knowledge of CI/CD pipelines using AWS/OpenShift.
- Good to have: front-end UI experience with React.js/Angular.
- Minimum 1 year of experience in Agile methodology.
- Deep understanding of database concepts and design for SQL (primarily) and NoSQL (secondarily): schema design, optimization, scalability, etc.
- Good experience with Git version control and a good understanding of code branching strategies and organization for code reuse.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Angular.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering high-quality software solutions.
- This position is based at our Hyderabad office. The resource should be willing to work in B shift.

Posted 3 months ago


5 - 7 years

5 - 9 Lacs

Nagpur

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: AWS Architecture
Good-to-have skills: Python (Programming Language)
Minimum 5 years of experience is required.
Educational Qualification: NA

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using AWS architecture. Your typical day will involve working with AWS services, developing and testing code, and collaborating with cross-functional teams to deliver high-quality solutions.

Roles & Responsibilities:
- Drive automation and integrate with CI/CD tools for continuous validation.
- Drive a mentality of building well-architected applications for the AWS cloud.
- Drive a mentality of quality being owned by the entire team.
- Identify code defects and work with other developers to address quality issues in product code.
- Find bottlenecks and thresholds in existing code through the use of automation tools.
- Articulate clear business objectives aligned to technical specifications and work in an iterative, agile pattern daily.
- Take ownership of your work tasks, be comfortable interacting with all levels of the team, and raise challenges when necessary.

Professional & Technical Skills:
- Core code production for back-end, middle-tier, and front-end applications.
- Deploying and developing AWS cloud applications and services end to end.
- Operational triage of bugs, failed test cases, and system failures.
- Creating and optimizing infrastructure performance metrics.
- Mapping user stories to detailed technical specifications.
- Completing detailed peer code reviews.
- Architecting pilots and proof-of-concept efforts to spur innovation.
- Working in all stages of the development lifecycle.
- Automation of manual data object creation and test cases.
- Ask smart questions, collaborate, team up, take risks, and champion new ideas.
- Extensive experience with AWS or other cloud technologies, including Glue, Lambda, S3, IAM, VPC, EC2, Athena, CloudWatch, DynamoDB, and RDS.
- Understanding of the serverless mindset in architectural solutioning.
- Strong Terraform (infrastructure-as-code) experience.
- Experience with DevOps and CI/CD tools: Jenkins, CloudBees, Please Build, etc.
- Proficiency with OOP languages such as Python, Java, or Scala (Python preferred).
- Proficiency working with large data stores and data sets.
- Deep understanding of database concepts and design for SQL (primarily) and NoSQL (secondarily): schema design, optimization, scalability, etc.
- Solid experience with Git version control and a good understanding of code branching strategies and organization for code reuse.

Qualification: NA
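
One small, hedged example of the "creating and optimizing infrastructure performance metrics" bullet above: publishing a custom CloudWatch metric from Python with boto3. The namespace and metric name are invented for illustration.

# Hedged sketch: push one custom datapoint to CloudWatch.
import boto3

cloudwatch = boto3.client("cloudwatch")

def record_pipeline_latency(seconds: float) -> None:
    # Publishes a single latency datapoint under a custom namespace.
    cloudwatch.put_metric_data(
        Namespace="Example/DataPipelines",  # hypothetical namespace
        MetricData=[{
            "MetricName": "RunLatencySeconds",
            "Value": seconds,
            "Unit": "Seconds",
        }],
    )

record_pipeline_latency(42.5)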

Posted 3 months ago


4 - 7 years

12 - 18 Lacs

Bengaluru, Mumbai (All Areas)

Hybrid


We're hiring a Data Engineer with 5-8 years of experience in AWS (S3, Redshift, Glue, EMR, Lambda), Apache Spark, and SQL. You will build scalable ETL/ELT pipelines and enable business insights through data engineering. JD: https://tinyurl.com/dataengineerblr

Perks and benefits: annual bonus, life insurance, performance bonus.

Posted 3 months ago


6 - 11 years

14 - 16 Lacs

Kolkata, Mumbai (All Areas)

Work from Office


- AWS DWH tech stack with Python.
- Create solutions in Python for building reusable base classes using AWS libraries.
- End-to-end data pipelines in PySpark and Athena.
- End-to-end data pipelines using AWS Glue in Python with boto3.
- DWH implementation.
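
A minimal boto3 sketch matching the "AWS Glue in Python with boto3" line above: start a Glue job run and poll it to completion. The job name is a hypothetical placeholder.

# Hedged sketch: trigger a Glue job and wait for a terminal state.
import time
import boto3

glue = boto3.client("glue")
JOB_NAME = "example-dwh-load"  # hypothetical Glue job

run_id = glue.start_job_run(JobName=JOB_NAME)["JobRunId"]

while True:
    state = glue.get_job_run(JobName=JOB_NAME, RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        print(f"Glue job finished with state {state}")
        break
    time.sleep(30)  # poll every 30 seconds while the job runs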

Posted 3 months ago


5 - 10 years

20 - 35 Lacs

Bengaluru

Hybrid


Expertise in JupyterLab, SAS Studio, and R Studio for data science workflows. SAS programming (Base SAS, Advanced SAS, SAS Viya), R, and Python. Experience in managing and scaling JupyterHub, RStudio Server, and SAS Viya in cloud or on-prem environments.

Posted 3 months ago


6 - 11 years

19 - 34 Lacs

Bengaluru, Hyderabad, Kolkata

Work from Office


Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of AWS Developer! We are looking for candidates who have a passion for cloud with knowledge of different cloud environments. Ideal candidates should have technical experience in AWS platform services: IAM roles and policies, Glue, Lambda, EC2, S3, SNS, SQS, EKS, KMS, etc. This key role demands a highly motivated individual with a strong background in computer science or software engineering. You are meticulous, thorough, and possess excellent communication skills to engage with all levels of our stakeholders. A self-starter, you are up to speed with the latest developments in the tech world.

Responsibilities:
- Hands-on experience and good skills in AWS platform services: IAM roles and policies, Glue, Lambda, EC2, S3, SNS, SQS, EKS, KMS, etc.
- Must have good working knowledge of Kubernetes and Docker.
- Utilize AWS services such as AWS Glue, Amazon S3, AWS Lambda, and others to optimize performance, reliability, and cost-effectiveness.
- Develop scripts, utilities, and automation tools to facilitate the migration process and ensure compatibility with AWS services.
- Implement best practices for security, scalability, and fault tolerance in AWS-based solutions.
- Experience in AWS cost analysis and a thorough understanding of how to optimize AWS cost.
- Must have good working knowledge of deployment templates like Terraform/CloudFormation.
- Ability to multi-task and manage various project elements simultaneously.

Qualifications we seek in you!

Minimum Qualifications / Skills:
- Bachelor's degree with experience in information technology.
- Must have experience in AWS platform services.

Preferred Qualifications / Skills:
- Very good written and presentation/verbal communication skills with experience in a customer-interfacing role.
- In-depth requirement-understanding skills with good analytical and problem-solving ability, interpersonal efficiency, and a positive attitude.
- Experience in ML/AI.
- Experience in the telecommunication industry.
- Experience with cloud providers (e.g., AWS, GCP).

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
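
To make the "AWS cost analysis" requirement concrete, here is a hedged example that queries month-to-date cost grouped by service through the Cost Explorer API with boto3. It assumes Cost Explorer is enabled on the account; note that on the first day of a month the start and end dates would coincide and the call would need adjusting.

# Hedged sketch: month-to-date AWS cost by service via Cost Explorer.
from datetime import date
import boto3

ce = boto3.client("ce")

resp = ce.get_cost_and_usage(
    TimePeriod={
        "Start": date.today().replace(day=1).isoformat(),
        "End": date.today().isoformat(),
    },
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

# Print a simple per-service cost breakdown for the current month.
for group in resp["ResultsByTime"][0]["Groups"]:
    service = group["Keys"][0]
    amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
    print(f"{service}: ${amount:.2f}")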

Posted 3 months ago


4 - 8 years

5 - 15 Lacs

Mysore

Hybrid


4 to 7 years of experience in AWS data engineering; 2 end-to-end data engineering projects; ETL, data quality and governance, collaboration; big data technologies including S3, Glue, EMR, Redshift, and Lambda; proficiency in SQL and database design; Hadoop and Spark.

Posted 3 months ago


4 - 8 years

5 - 15 Lacs

Madurai

Hybrid


4 to 7 years of experience in AWS data engineering; 2 end-to-end data engineering projects; ETL, data quality and governance, collaboration; big data technologies including S3, Glue, EMR, Redshift, and Lambda; proficiency in SQL and database design; Hadoop and Spark.

Posted 3 months ago


4 - 8 years

5 - 15 Lacs

Visakhapatnam

Hybrid


4 to 7 years of experience in AWS data engineering; 2 end-to-end data engineering projects; ETL, data quality and governance, collaboration; big data technologies including S3, Glue, EMR, Redshift, and Lambda; proficiency in SQL and database design; Hadoop and Spark.

Posted 3 months ago


4 - 8 years

5 - 15 Lacs

Coimbatore

Hybrid


4 to 7 years of experience in AWS data engineering; 2 end-to-end data engineering projects; ETL, data quality and governance, collaboration; big data technologies including S3, Glue, EMR, Redshift, and Lambda; proficiency in SQL and database design; Hadoop and Spark.

Posted 3 months ago


4 - 8 years

5 - 15 Lacs

Bengaluru

Hybrid


4 to 7 years of experience in AWS data engineering; 2 end-to-end data engineering projects; ETL, data quality and governance, collaboration; big data technologies including S3, Glue, EMR, Redshift, and Lambda; proficiency in SQL and database design; Hadoop and Spark.

Posted 3 months ago


4 - 8 years

5 - 15 Lacs

Vijayawada

Hybrid


4 to 7 years of experience in AWS data engineering; 2 end-to-end data engineering projects; ETL, data quality and governance, collaboration; big data technologies including S3, Glue, EMR, Redshift, and Lambda; proficiency in SQL and database design; Hadoop and Spark.

Posted 3 months ago


4 - 8 years

5 - 15 Lacs

Pimpri-Chinchwad

Hybrid


4 to 7 years of experience in AWS data engineering; 2 end-to-end data engineering projects; ETL, data quality and governance, collaboration; big data technologies including S3, Glue, EMR, Redshift, and Lambda; proficiency in SQL and database design; Hadoop and Spark.

Posted 3 months ago


4 - 8 years

5 - 15 Lacs

Gurgaon

Hybrid


4 to 7 years of experience in AWS data engineering; 2 end-to-end data engineering projects; ETL, data quality and governance, collaboration; big data technologies including S3, Glue, EMR, Redshift, and Lambda; proficiency in SQL and database design; Hadoop and Spark.

Posted 3 months ago


4 - 8 years

5 - 15 Lacs

Hyderabad

Hybrid


4 to 7 years of experience in AWS data engineering; 2 end-to-end data engineering projects; ETL, data quality and governance, collaboration; big data technologies including S3, Glue, EMR, Redshift, and Lambda; proficiency in SQL and database design; Hadoop and Spark.

Posted 3 months ago


10 - 13 years

9 - 17 Lacs

Pune, Mumbai, Bengaluru

Work from Office


Job Title: Senior AWS Consultant for a US-Based MNC Organization
Location: Pan India
Employment Type: Permanent
Work Mode: Remote
Mode of Interview: Virtual (2 rounds)
Years of Experience: 10 to 15 years
Notice Period: Immediate to 15 days

We are hiring for a US-based MNC organization for a Pan-India location. This is a fantastic opportunity for candidates with a strong background working as a Senior AWS Consultant.

Job Responsibilities:
- Candidates with an AWS background who have worked on AWS Lambdas, S3, CDN (CloudFront), SQS, SNS, EventBridge, and the API layer for implementation, along with Glue and RDS.
- Should be well versed and have implemented a couple of projects entirely on AWS with the services highlighted below, and should be able to articulate them.
- Good knowledge of Lambdas, S3, CDN (CloudFront), the Amazon API layer, SQS, Step Functions, Glue, and RDS.
- AWS expert with experience in AWS Glue, API Gateway, integrations, SQS, and Lambda.
- 10+ years of development experience, including designing and developing integration applications with Java/J2EE web applications and different integration systems.
- Collaborate with cross-functional teams to understand their integration and data requirements and translate these into technical requirements.
- Strong experience in setting up, configuring, and integrating API gateways in AWS.
- Experience in AWS Cloud is a must.
- Experience in API frameworks, XML/JSON, REST, and data protection as applied to software design, build, test, documentation, and integration with different systems.
- Utilize multiple data sources such as Oracle, MS SQL Server, complex flat files, delimited files, and XML files for handling feeds.
- Unit and functional testing experience, for example with Jest, Testing Library, or Cypress.
- Software development lifecycle experience and experience working in an Agile methodology.
- Enhance or improve business processes via integration, or, as necessary, minimize the impact of integration on those processes.

How to Apply: If you have the experience and skills required for this role, send your profile to amish.solanki@delta-ssi.net and fill in the details below:
- Total Work Experience:
- Relevant Work Experience:
- Current CTC:
- Expected CTC:
- Current Location:
- Notice Period (negotiable to how many days):
- If Serving/Served Notice Period (last working day):
- Current Company:
- Current Payroll Organization:
- Alternate No.:
- Date of Birth:
- Reason for Job Change:
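
A sketch of the API-layer pattern this posting emphasizes (API Gateway, Lambda, and SQS integration): a Lambda handler behind API Gateway that validates a JSON body and enqueues it to SQS. The queue URL and field names are hypothetical placeholders, not the client's actual design.

# Hedged sketch: API Gateway -> Lambda -> SQS enqueue.
import json
import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.ap-south-1.amazonaws.com/123456789012/example-orders"

def handler(event, context):
    # API Gateway proxy integration delivers the request body as a string.
    try:
        payload = json.loads(event.get("body") or "{}")
        order_id = payload["order_id"]  # minimal validation
    except (ValueError, KeyError):
        return {"statusCode": 400, "body": json.dumps({"error": "invalid payload"})}

    sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps(payload))
    return {"statusCode": 202, "body": json.dumps({"queued": order_id})}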

Posted 3 months ago


1 - 6 years

3 - 8 Lacs

Hyderabad

Work from Office


Job Area: Engineering Group > Software Engineering

General Summary: We are looking for a talented, motivated leader with experience in building scalable cloud services, infrastructure, and processes. As part of the IoT (Internet of Things) team, you will be working on the next generation of IoT products. As a Business Intelligence Engineer (BIE) in this role, you will solve unique and complex problems at a rapid pace, utilizing the latest technologies to create highly scalable solutions. You will have deep expertise in gathering requirements and insights, mining large and diverse data sets, data visualization, writing complex SQL queries, building rapid prototypes using Python/R, and generating insights that enable senior leaders to make critical business decisions.

Key Job Responsibilities:
- Utilize your deep expertise in business analysis, metrics, reporting, and analytic tools/languages like SQL, Excel, and others to translate data into meaningful insights through collaboration with scientists, software engineers, data engineers, and business analysts.
- Take end-to-end ownership of the operational, financial, and technical aspects of the insights you are building for the business, and play an integral role in strategic decision-making.
- Conduct deep-dive analyses of business problems and formulate conclusions and recommendations to be presented to senior leadership.
- Produce recommendations and insights that will help shape effective metric development and reporting for key stakeholders.
- Simplify and automate reporting, audits, and other data-driven activities.
- Partner with engineering teams to enhance data infrastructure, data availability, and broad access to customer insights.
- Develop and drive best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation.
- Learn new technologies and techniques to meaningfully support product and process innovation.

Basic Qualifications:
- 1+ years of experience using SQL to query data from databases, data warehouses, or cloud data sources (e.g., Redshift, MySQL, PostgreSQL, MS SQL Server, BigQuery).
- Experience with data visualization using Tableau, Power BI, QuickSight, or similar tools.
- Bachelor's degree in Statistics, Economics, Math, Finance, Engineering, Computer Science, Information Systems, or a related quantitative field.
- Ability to operate successfully and independently in a fast-paced environment.
- Comfort with ambiguity and eagerness to learn new skills.
- Knowledge of cloud services (AWS, GCP, and/or Azure) is a must.

Preferred Qualifications:
- Experience with AWS solutions such as EC2, Athena, Glue, S3, and Redshift.
- Experience in data mining, ETL, etc., and using databases in a business environment with large-scale, complex datasets.
- Experience creating and building predictive/optimization tools that benefit the business and improve customer experience.
- Experience articulating business questions and using quantitative techniques to drive insights for business.
- Experience dealing with technical and non-technical senior-level managers.

Minimum Qualifications: Bachelor's degree in Engineering, Information Systems, Computer Science, or a related field.

Applicants: Qualcomm is an equal opportunity employer. If you are an individual with a disability and need an accommodation during the application/hiring process, rest assured that Qualcomm is committed to providing an accessible process. You may e-mail disability-accomodations@qualcomm.com or call Qualcomm's toll-free number. Upon request, Qualcomm will provide reasonable accommodations to support individuals with disabilities to participate in the hiring process. Qualcomm is also committed to making our workplace accessible for individuals with disabilities. (Keep in mind that this email address is used to provide reasonable accommodations for individuals with disabilities. We will not respond here to requests for updates on applications or resume inquiries.)

Qualcomm expects its employees to abide by all applicable policies and procedures, including but not limited to security and other requirements regarding protection of Company confidential information and other confidential and/or proprietary information, to the extent those requirements are permissible under applicable law.

To all Staffing and Recruiting Agencies: Our Careers Site is only for individuals seeking a job at Qualcomm. Staffing and recruiting agencies and individuals being represented by an agency are not authorized to use this site or to submit profiles, applications, or resumes, and any such submissions will be considered unsolicited. Qualcomm does not accept unsolicited resumes or applications from agencies. Please do not forward resumes to our jobs alias, Qualcomm employees, or any other company location. Qualcomm is not responsible for any fees related to unsolicited resumes/applications. If you would like more information about this role, please contact Qualcomm Careers.

Posted 3 months ago

Apply

8 - 13 years

10 - 15 Lacs

Bengaluru

Work from Office


Role: Lead position with primary skillsets in AWS services, with experience in EC2, S3, Redshift, RDS, AWS Glue/EMR, Python, PySpark, SQL, Airflow, visualization tools, and Databricks.

Responsibilities:
- Design and implement data modeling, data ingestion, and data processing for various datasets.
- Design, develop, and maintain an ETL framework for various new data sources.
- Migrate the existing Talend ETL workflows into the new ETL framework using AWS Glue/EMR and PySpark, and/or build data pipelines using Python.
- Build orchestration workflows using Airflow.
- Develop and execute ad hoc data ingestion to support business analytics.
- Proactively interact with vendors on any questions and report status accordingly.
- Explore and evaluate tools and services to support business requirements.
- Help create a data-driven culture and impactful data strategies.
- Show aptitude for learning new technologies and solving complex problems.
- Connect with the customer to gather requirements and ensure timely delivery.

Qualifications:
- Minimum of a bachelor's degree, preferably in Computer Science, Information Systems, or Information Technology.
- Minimum 8+ years of experience on cloud platforms such as AWS, Azure, or GCP.
- Minimum 8+ years of experience with Amazon Web Services like VPC, S3, EC2, Redshift, RDS, EMR, Athena, IAM, Glue, DMS, Data Pipeline and API, Lambda, etc.
- Minimum 8+ years of experience in ETL and data engineering using Python, AWS Glue, AWS EMR/PySpark, Talend, and Airflow for orchestration.
- Minimum 8+ years of experience in SQL, Python, and source control such as Bitbucket, plus CI/CD for code deployment.
- Experience with PostgreSQL, SQL Server, MySQL, and Oracle databases.
- Experience with MPP systems such as AWS Redshift and EMR.
- Experience in distributed programming with Python, Unix scripting, MPP, and RDBMS databases for data integration.
- Experience building distributed high-performance systems using Spark/PySpark and AWS Glue, and developing applications for loading/streaming data into databases such as Redshift.
- Experience in Agile methodology.
- Proven skills in writing technical specifications for data extraction and good-quality code.
- Experience with big data processing techniques using Sqoop, Spark, and Hive is an additional plus.
- Experience with analytic visualization tools.
- Design of data solutions on Databricks, including Delta Lake, data warehouses, data marts, and other data solutions to support the analytics needs of the organization.
- Should be an individual contributor with experience in the technologies mentioned above.
- Should be able to lead the offshore team, ensuring on-time delivery, code review, and work management among team members.
- Should have experience in customer communication.
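
For the Airflow orchestration requirement, a minimal illustrative DAG that triggers a Glue job through boto3 is sketched below. It assumes Airflow 2.4+ (for the schedule argument); the DAG id and Glue job name are placeholders, not a prescribed setup.

# Hedged sketch: daily Airflow DAG that starts a Glue job via boto3.
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator

def run_glue_job():
    # Fire-and-forget trigger; a production DAG would also poll the run state.
    glue = boto3.client("glue")
    glue.start_job_run(JobName="example-ingest-job")  # hypothetical job

with DAG(
    dag_id="example_daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="start_glue_job", python_callable=run_glue_job)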

Posted 3 months ago

Apply

3 - 6 years

6 - 10 Lacs

Pune

Work from Office


As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, providing regular support and guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include:
- Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements.
- Strive for continuous improvement by testing the built solution and working under an agile framework.
- Discover and implement the latest technology trends to maximize value and build creative solutions.

Required Education: Bachelor's Degree
Preferred Education: Master's Degree

Required Technical and Professional Expertise:
- Design and develop data solutions: design and implement efficient data processing pipelines using AWS services like AWS Glue, AWS Lambda, Amazon S3, and Amazon Redshift.
- Develop and manage ETL (Extract, Transform, Load) workflows to clean, transform, and load data into structured and unstructured storage systems.
- Build scalable data models and storage solutions in Amazon Redshift, DynamoDB, and other AWS services.
- Data integration: integrate data from multiple sources, including relational databases, third-party APIs, and internal systems, to create a unified data ecosystem.
- Work with data engineers to optimize data workflows and ensure data consistency, reliability, and performance.
- Automation and optimization: automate data pipeline processes to ensure efficiency.

Preferred Technical and Professional Experience:
- Define, drive, and implement an architecture strategy and standards for end-to-end monitoring.
- Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering.
- Good to have: detection and prevention tools for company products and platform, and customer-facing experience.
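
As a hedged sketch of the Glue-to-Redshift pipeline described above: read a table from the Glue Data Catalog, cast raw string fields to typed columns, and load the result into Redshift through a Glue JDBC connection. The database, table, connection, and bucket names are all hypothetical.

# Hedged Glue ETL sketch: Data Catalog -> typed columns -> Redshift.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a table already crawled into the Glue Data Catalog.
orders = glue_context.create_dynamic_frame.from_catalog(
    database="example_db", table_name="raw_orders"
)

# Cast raw string fields to typed columns before loading downstream.
typed = orders.apply_mapping([
    ("order_id", "string", "order_id", "string"),
    ("amount", "string", "amount", "double"),
])

# Load into Redshift via a pre-created Glue connection; Glue stages
# intermediate files in the temp directory during the COPY.
glue_context.write_dynamic_frame.from_jdbc_conf(
    frame=typed,
    catalog_connection="example-redshift-conn",
    connection_options={"dbtable": "public.orders", "database": "analytics"},
    redshift_tmp_dir="s3://example-temp/redshift/",
)
job.commit()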

Posted 3 months ago

Apply

8 - 13 years

20 - 30 Lacs

Gurgaon

Remote


We are looking for an ETL Architect (immediate joiner).
Work Mode: Remote, 5 working days
Duration: Permanent, full-time

We are seeking a highly skilled and experienced ETL (Extract, Transform, Load) Technical Architect with expertise in AWS (Amazon Web Services) and PySpark to join our team. As the ETL Technical Architect, you will play a crucial role in designing and implementing ETL solutions that support data integration, transformation, and loading processes. Your deep knowledge of AWS services and PySpark will be instrumental in building scalable and efficient data pipelines.

Requirements:
- Architect ETL solutions: design end-to-end ETL solutions that align with business requirements, ensuring scalability, reliability, and performance.
- Data modeling: define and implement data models, schemas, and structures to support efficient data processing and storage.
- ETL pipeline development: develop, optimize, and maintain ETL pipelines using PySpark, leveraging AWS Glue or other relevant AWS services.
- AWS integration: integrate ETL processes with various AWS services, such as S3, Redshift, Athena, and EMR, to create a comprehensive data ecosystem.
- Performance optimization: identify and resolve performance bottlenecks in ETL processes, optimizing data extraction, transformation, and loading for efficiency.
- Data quality and governance: implement data quality checks and governance policies to ensure data accuracy and compliance with industry standards and regulations.
- Monitoring and troubleshooting: set up monitoring and alerting systems to proactively identify and address issues within the ETL pipelines.
- Collaboration: work closely with data engineers, data scientists, and other stakeholders to understand their data requirements and ensure successful data delivery.
- Documentation: create and maintain technical documentation, including architecture diagrams, data flow diagrams, and standard operating procedures.
- Stay current: keep up to date with the latest ETL, AWS, and PySpark developments and best practices, and apply them to enhance existing processes.

Qualifications:
1) Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
2) Proven experience as an ETL Technical Architect with a strong focus on AWS and PySpark.
3) In-depth knowledge of AWS services, including but not limited to S3, Redshift, Athena, EMR, and Glue.
4) Proficiency in PySpark and related big data technologies for ETL processing.
5) Strong SQL skills for data manipulation and querying.
6) Familiarity with data warehousing concepts and dimensional modeling.
7) Experience with data governance, data quality, and data security practices.
8) Excellent problem-solving skills and attention to detail.
9) Strong communication and collaboration skills to work effectively with cross-functional teams.
10) AWS certifications (e.g., AWS Certified Data Analytics, AWS Certified Big Data) are a plus.
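
A short PySpark example of the performance-optimization theme in this posting: de-duplicate raw events and write date-partitioned Parquet so downstream Athena or Redshift Spectrum scans prune partitions instead of reading everything. Paths and columns are invented for illustration.

# Hedged PySpark sketch: de-duplicate and write partitioned Parquet.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-optimize").getOrCreate()

events = spark.read.json("s3://example-raw/events/")  # hypothetical path

# Remove duplicate events and derive the partition column.
deduped = events.dropDuplicates(["event_id"]).withColumn(
    "event_date", F.to_date("event_ts")
)

(deduped
 .repartition("event_date")   # co-locate rows per partition value
 .write.mode("append")
 .partitionBy("event_date")
 .parquet("s3://example-curated/events/"))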

Posted 3 months ago

Apply

Exploring Glue Jobs in India

In recent years, the demand for professionals with expertise in glue technologies, most prominently AWS Glue and related data-integration tooling, has been on the rise in India. Glue jobs involve working with tools and platforms that connect systems and applications and move data between them. This article provides an overview of the glue job market in India, including top hiring locations, average salary ranges, career progression, related skills, and interview questions for aspiring job seekers.

Top Hiring Locations in India

Here are five major cities in India actively hiring for glue roles:
1. Bangalore
2. Pune
3. Hyderabad
4. Chennai
5. Mumbai

Average Salary Range

The estimated salary range for glue professionals in India varies by experience level. Entry-level professionals can expect to earn around INR 4 to 6 lakhs per annum, while experienced professionals with several years in the field can earn between INR 12 and 18 lakhs per annum.

Career Path

In the field of glue technologies, a typical career progression may include roles such as:
- Junior Developer
- Senior Developer
- Tech Lead
- Architect

Related Skills

Apart from expertise in glue technologies, professionals in this field are often expected to have or develop skills in:
- Data integration
- ETL (Extract, Transform, Load) processes
- Database management
- Programming languages (e.g., Python, Java)

Interview Questions

Here are 25 interview questions for glue roles (a short code sketch illustrating job bookmarks and DynamicFrames follows the list):
- What is Glue in the context of data integration? (basic)
- Explain the difference between ETL and ELT. (basic)
- How would you handle data quality issues in a glue job? (medium)
- Can you explain how Glue works with Apache Spark? (medium)
- What is the significance of schema evolution in Glue? (medium)
- How do you optimize Glue jobs for performance? (medium)
- Describe a scenario where you had to troubleshoot a failed Glue job. (medium)
- What is a bookmark in Glue and how is it used? (medium)
- How does Glue handle schema inference? (medium)
- Have you worked with AWS Glue DataBrew? If so, explain your experience. (medium)
- Explain how Glue handles schema evolution. (advanced)
- How does Glue support job bookmarks for incremental processing? (advanced)
- What are the differences between Glue ETL and Glue DataBrew? (advanced)
- How do you handle nested JSON structures in Glue transformations? (advanced)
- Explain a complex Glue job you have designed and implemented. (advanced)
- How does Glue handle dynamic frame operations? (advanced)
- What is the role of a Glue DynamicFrame in data transformation? (advanced)
- How do you handle schema changes in Glue jobs? (advanced)
- Explain how Glue can be integrated with other AWS services. (advanced)
- What are the limitations of Glue that you have encountered in your projects? (advanced)
- How do you monitor and debug Glue jobs in production environments? (advanced)
- Describe your experience with Glue job scheduling and orchestration. (advanced)
- How do you ensure security in Glue jobs that handle sensitive data? (advanced)
- Explain the concept of lazy evaluation in Glue. (advanced)
- How do you handle dependencies between Glue jobs in a workflow? (advanced)
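
Several of these questions (job bookmarks, DynamicFrames, incremental processing) come together in the hedged Glue job sketch below. Bucket paths are hypothetical, and the job is assumed to have been created with bookmarks enabled; bookmark state is keyed by each transformation_ctx, so reruns only pick up new input.

# Hedged sketch: incremental Glue job using bookmarks and DynamicFrames.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# transformation_ctx names the key the bookmark state is stored under,
# so only S3 objects unseen by previous successful runs are read.
incremental = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-raw/clicks/"]},
    format="json",
    transformation_ctx="clicks_source",
)

# DynamicFrames tolerate messy schemas; resolveChoice pins ambiguous types.
resolved = incremental.resolveChoice(specs=[("user_id", "cast:string")])

glue_context.write_dynamic_frame.from_options(
    frame=resolved,
    connection_type="s3",
    connection_options={"path": "s3://example-curated/clicks/"},
    format="parquet",
    transformation_ctx="clicks_sink",
)
job.commit()  # commits the job bookmark state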

Closing Remark

As you prepare for interviews and explore opportunities in the glue job market in India, remember to showcase your expertise in glue technologies, related skills, and problem-solving abilities. With the right preparation and confidence, you can land a rewarding career in this dynamic and growing field. Good luck!
