
567 Glue Jobs - Page 2

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

6.0 - 10.0 years

8 - 12 Lacs

Gurugram

Work from Office

Join us as a Data Engineer (PySpark, AWS). We're looking for someone to build effortless, digital-first customer experiences to help simplify our organisation and keep our data safe and secure. Day to day, you'll develop innovative, data-driven solutions through data pipelines, modelling and ETL design while aspiring to be commercially successful through insights. If you're ready for a new challenge and want to bring a competitive edge to your career profile by delivering streaming data ingestions, this could be the role for you. We're offering this role at associate vice president level.

What you'll do:
- Develop a comprehensive knowledge of our data structures and metrics, advocating for change when needed for product development.
- Provide transformation solutions and carry out complex data extractions.
- Develop a clear understanding of data platform cost levels to build cost-effective and strategic solutions.
- Source new data using the most appropriate tooling before integrating it into the overall solution to deliver it to our customers.

You'll also be responsible for:
- Driving customer value by understanding complex business problems and requirements to correctly apply the most appropriate and reusable tools to build data solutions.
- Participating in the data engineering community to deliver opportunities that support our strategic direction.
- Carrying out complex data engineering tasks to build a scalable data architecture and transform data so it is usable by analysts and data scientists.
- Building advanced automation of data engineering pipelines by removing manual stages.
- Leading on the planning and design of complex products and providing guidance to colleagues and the wider team when required.

The skills you'll need:
- An understanding of data usage and dependencies with wider teams and the end customer, plus experience of extracting value and features from large-scale data.
- At least eight years of experience working with Python, PySpark and SQL.
- Experience in AWS architecture using EMR, EC2, S3, Lambda and Glue.
- Experience in Apache Airflow, Anaconda and SageMaker.
- Experience of using programming languages alongside knowledge of data and software engineering fundamentals.
- Experience with performance optimisation and tuning.
- Good knowledge of modern code development practices.
- Great communication skills with the ability to proactively engage with a range of stakeholders.
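For candidates gauging this stack, here is a minimal, illustrative PySpark sketch of the kind of S3-to-S3 pipeline the posting describes; the bucket names, columns, and app name are hypothetical, not taken from the employer:

```python
# Illustrative sketch only: buckets, paths, and schema are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("customer-ingest").getOrCreate()

# Read raw JSON events landed in S3 by an upstream ingestion job.
raw = spark.read.json("s3://example-raw-bucket/customer-events/")

# Basic transformation: normalise column names and derive a partition date.
cleaned = (
    raw.withColumnRenamed("custId", "customer_id")
       .withColumn("event_date", F.to_date("event_ts"))
       .dropDuplicates(["customer_id", "event_ts"])
)

# Write query-friendly Parquet, partitioned for Athena/Redshift Spectrum.
(cleaned.write.mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3://example-curated-bucket/customer-events/"))
```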

Posted 4 days ago

Apply

2.0 - 4.0 years

5 - 15 Lacs

Bengaluru, Karnataka, India

On-site

As a Software Engineer, you will:
- Analyze business and functional requirements to design and implement scalable data integration solutions.
- Understand and interpret High-Level Design (HLD) documents and convert them into detailed Low-Level Design (LLD).
- Develop robust, reusable, and optimized Informatica mappings, sessions, and workflows.
- Conduct peer code reviews and suggest improvements for reliability and performance.
- Prepare and execute comprehensive unit test cases and support system/integration testing.
- Maintain detailed technical documentation, including LLDs, data flow diagrams, and test cases.
- Build data pipelines and transformation logic in Snowflake, ensuring performance and scalability.
- Develop and manage Unix shell scripts for automation, scheduling, and monitoring of ETL jobs.
- Collaborate with cross-functional teams to support UAT, deployments, and production issues.

Requirements: You are a fit for this position if your background includes:
- 2-4 years of strong hands-on experience with Informatica PowerCenter.
- Proficiency in developing and optimizing ETL mappings, workflows, and sessions.
- Solid experience with performance tuning techniques and best practices in ETL processes.
- Hands-on experience with Snowflake for data loading, SQL transformations, and optimization.
- Strong skills in Unix/Linux scripting for job automation.
- Experience in converting HLDs into LLDs and defining unit test cases.
- Knowledge of data warehousing concepts, data modeling, and data quality frameworks.

Good to have:
- Knowledge of the Salesforce data model and integration (via Informatica or API-based solutions).
- Exposure to AWS cloud services like S3, Glue, Redshift, Lambda, etc.
- Familiarity with relational databases such as SQL Server and PostgreSQL.
- Experience with job schedulers like Control-M, ESP, or equivalent.
- Agile methodology experience and tools such as JIRA, Confluence, and Git.
- Knowledge of dbt (Data Build Tool) for data transformation and orchestration.
- Experience with Python scripting for data manipulation, automation, or integration tasks.
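As a rough illustration of the Snowflake loading step such a pipeline might automate (for example from a Unix-scheduled script), here is a hedged sketch using the Python Snowflake connector; the account, stage, and table names are placeholders:

```python
# Hypothetical load step: bulk-load files staged by an upstream
# Informatica workflow. Connection details and names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    cur.execute(
        "COPY INTO STAGING.ORDERS FROM @ETL_STAGE/orders/ "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    print(cur.fetchall())  # per-file load results returned by COPY
finally:
    conn.close()
```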

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

You are a highly skilled Senior Python Developer responsible for designing and developing scalable, efficient software applications using Python and AWS services. Your role involves collaborating with cross-functional teams to ensure high-quality software applications.

Key Responsibilities:
- Design and develop scalable and efficient software applications using Python and AWS services, including RESTful APIs using Flask or Django, data processing and analytics pipelines using AWS services (e.g. S3, Lambda, Glue), and cloud-based applications using AWS services (e.g. EC2, RDS, Elastic Beanstalk).
- Collaborate with cross-functional teams: development teams to ensure testability and feasibility of requirements, Quality Assurance teams to ensure alignment with testing methodologies and standards, and Product Management teams to ensure alignment with product vision and requirements.
- Develop and maintain AWS services, including S3 bucket management and data processing, Lambda function development and deployment, and Glue data catalog management and ETL development.
- Participate in testing activities, including unit testing using Python testing frameworks (e.g. unittest, pytest), integration testing using AWS services (e.g. S3, Lambda), and end-to-end testing using AWS services (e.g. API Gateway, Elastic Beanstalk).
- Collaborate with development teams to ensure timely and accurate defect fixes, including defect tracking and prioritization, defect reproduction and debugging, and defect verification and closure.
- Stay up to date with the latest AWS services and cloud-based technologies, and apply this knowledge to improve software applications and efficiency.

Requirements:
- 5+ years of experience in software development, with a strong understanding of Python and AWS services.
- Strong understanding of testing frameworks and tools, including Python testing frameworks (e.g. unittest, pytest) and AWS services (e.g. S3, Lambda, Glue).
- Experience with cloud-based architectures and AWS services, including EC2, RDS, Elastic Beanstalk, S3, Lambda, and Glue.
- Strong problem-solving skills, with the ability to troubleshoot and debug complex issues.
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams.
- Bachelor's degree in Computer Science or a related field, or equivalent experience.

Nice to Have:
- Experience with Agile development methodologies and Scrum frameworks.
- Knowledge of containerization using Docker.
- Familiarity with DevOps tools such as Jenkins or GitLab.
- Certification in AWS services or related technologies.
- Experience with security and compliance frameworks such as HIPAA or PCI-DSS.

What We Offer:
- Competitive salary and benefits package.
- Opportunities for career growth and professional development.
- Collaborative and dynamic work environment.
- Flexible working hours and remote work options.
- Access to the latest technologies and tools.
- Recognition and rewards for outstanding performance.

AWS Services Experience:
- Experience with AWS services, including S3, Lambda, Glue, EC2, RDS, Elastic Beanstalk, API Gateway, and CloudFormation.
- Experience with AWS SDKs and tools, including Boto3 (the AWS SDK for Python) and the AWS CLI.
- Experience with AWS best practices and security guidelines, including IAM roles and permissions, VPC and subnet configuration, security groups and network ACLs, and data encryption and access control.
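To illustrate the Lambda-plus-Glue pattern this role mentions, here is a minimal hedged sketch of a Lambda handler that starts a Glue ETL job for each new S3 object; the job name and argument keys are hypothetical, not a real deployment:

```python
# Hypothetical handler for an S3-triggered Lambda; names are placeholders.
import boto3

glue = boto3.client("glue")

def lambda_handler(event, context):
    runs = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Kick off the Glue job, passing the new object as the source path.
        resp = glue.start_job_run(
            JobName="example-etl-job",
            Arguments={"--source_path": f"s3://{bucket}/{key}"},
        )
        runs.append(resp["JobRunId"])
    return {"started_runs": runs}
```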

Posted 4 days ago

Apply

2.0 - 3.0 years

5 - 15 Lacs

Hyderabad, Telangana, India

On-site

About the Role: In this role as a Data Engineer, you will:
- Design, develop, optimize, and automate data pipelines that blend and transform data across different sources to help drive business intelligence, data science, and self-service data solutions.
- Work closely with data scientists and data visualization teams to understand data requirements and ensure the availability of high-quality data for analytics, modelling, and reporting.
- Build pipelines that source, transform, and load both structured and unstructured data, keeping in mind data security and access controls.
- Explore large volumes of data with curiosity and conviction.
- Contribute to the strategy and architecture of data management systems and solutions.
- Proactively troubleshoot and resolve data-related and performance bottlenecks in a timely manner.
- Be open to learning and working on emerging technologies in the data engineering, data science, and cloud computing space.
- Have the curiosity to interrogate data, conduct independent research, utilize various techniques, and tackle ambiguous problems.

Shift Timings: 12 PM to 9 PM (IST). Work from office for 2 days a week (mandatory).

About You: You're a fit for the role of Data Engineer if your background includes:
- At least 4-6 years of total work experience, with at least 2+ years in data engineering or analytics domains.
- A degree in data analytics, data science, computer science, software engineering, or another data-centric discipline.
- SQL proficiency (a must).
- Experience with data pipeline and transformation tools such as dbt, Glue, FiveTran, Alteryx, or similar solutions.
- Experience using cloud-based data warehouse solutions such as Snowflake, Redshift, or Azure.
- Experience with orchestration tools like Airflow or Dagster.
- Preferred: experience using Amazon Web Services (S3, Glue, Athena, QuickSight).
- Data modelling knowledge of schemas like snowflake and star.
- Experience building data pipelines and other custom automated solutions to speed the ingestion, analysis, and visualization of large volumes of data.
- Knowledge of building ETL workflows, database design, and query optimization.
- Experience with a scripting language like Python.
- Works well within a team and collaborates with colleagues across domains and geographies.
- Excellent oral, written, and visual communication skills.
- A demonstrable ability to assimilate new information thoroughly and quickly.
- A strong logical and scientific approach to problem-solving.
- The ability to articulate complex results in a simple and concise manner to all levels within the organization.
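Since the role names Airflow for orchestration, here is an illustrative DAG sketch (Airflow 2.x style); the task logic, IDs, and schedule are hypothetical:

```python
# Minimal Airflow DAG sketch; names and task bodies are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw files from the source system into S3")

def transform():
    print("clean, model, and load the data for analytics")

with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # cron expressions also work here
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task  # transform runs only after extract succeeds
```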

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As an AWS Senior Data Engineer at our organization, you will be responsible for working with various technologies and tools to support data engineering activities. Your primary tasks will include utilizing SQL for data querying and manipulation, developing data processing pipelines using PySpark, and integrating data from API endpoints. Additionally, you will work with AWS services such as Glue for ETL processes, S3 for data storage, Redshift for data warehousing, Step Functions for workflow automation, Lambda for serverless computing, CloudWatch for monitoring, and AppFlow for data integration.

You should have experience with CloudFormation and administrative roles, as well as knowledge of SDLF & OF frameworks for data lifecycle management. Understanding S3 ingestion patterns and version control using Git is essential for this role. Exposure to tools like JFrog, ADO, SNOW, Visual Studio, DBeaver, and SF Inspector will help you carry out your data engineering tasks effectively. Your role will involve collaborating with cross-functional teams to ensure the successful implementation of data solutions within the AWS environment.
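As a hedged illustration of the Step Functions workflow automation mentioned above, this sketch starts a state machine that might orchestrate Glue and Lambda stages; the ARN and payload are placeholders:

```python
# Hypothetical kick-off of a Step Functions execution; values are placeholders.
import json

import boto3

sfn = boto3.client("stepfunctions")

response = sfn.start_execution(
    stateMachineArn=(
        "arn:aws:states:us-east-1:123456789012:stateMachine:example-pipeline"
    ),
    input=json.dumps({"ingest_date": "2024-01-01", "source": "appflow"}),
)
print(response["executionArn"])  # track the run in CloudWatch / the console
```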

Posted 4 days ago

Apply

6.0 - 11.0 years

16 - 31 Lacs

Noida, Pune, Gurugram

Hybrid

Role & responsibilities:
- AWS data developers with 6-10 years of experience; certified candidates (AWS Data Engineer Associate or AWS Solutions Architect) are preferred.
- Skills required: SQL, AWS Glue, PySpark, Airflow, CDK, Redshift.
- Good communication skills and the ability to deliver independently.
- Banking / Financial / Payment Gateway or similar domain experience is preferred.
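Because CDK is in the required skill list, here is a minimal hedged sketch of an AWS CDK (v2, Python) stack that provisions a Glue job; the bucket, role ARN, and script path are placeholders, not a real deployment:

```python
# Illustrative CDK stack; all resource names and ARNs are hypothetical.
import aws_cdk as cdk
from aws_cdk import aws_glue as glue, aws_s3 as s3

class ExamplePipelineStack(cdk.Stack):
    def __init__(self, scope, construct_id, **kwargs):
        super().__init__(scope, construct_id, **kwargs)

        # Bucket that would hold the Glue job script.
        bucket = s3.Bucket(self, "ScriptBucket")

        # Low-level Glue job definition via the CloudFormation resource.
        glue.CfnJob(
            self,
            "EtlJob",
            name="example-etl-job",
            role="arn:aws:iam::123456789012:role/example-glue-role",
            command=glue.CfnJob.CommandProperty(
                name="glueetl",
                python_version="3",
                script_location=f"s3://{bucket.bucket_name}/scripts/job.py",
            ),
            glue_version="4.0",
        )

app = cdk.App()
ExamplePipelineStack(app, "ExamplePipelineStack")
app.synth()
```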

Posted 5 days ago

Apply

4.0 - 9.0 years

14 - 19 Lacs

Chennai

Work from Office

Embark on your transformative journey as Data Architect, Vice President, based out of Chennai. You will design, develop, and implement solutions to solve complex business problems, collaborating with stakeholders to understand their needs and requirements, designing and implementing solutions that meet those needs, and creating solutions that balance technology risks against business delivery, driving consistency.

The role of the data architect is to define the appropriate data architecture for the data products, implementing the data product principles and standards defined and agreed while engaging with the BUK CDA and CDO teams. The requirement is to translate business / use case requirements into logical and physical data models which serve as the basis for data engineers to build the data products. This includes working with business teams to capture their requirements and translating these into data models while considering performance implications. The role will include testing the models with data engineers and continuously monitoring and optimizing the performance of these models.

The team engages with the CDA team to design data product solutions covering data architecture, platform design, and integration patterns, ensuring they are in line with policy and technical data standards. The data architect:
- Defines appropriate data pipelines and design patterns that orchestrate data from source to target across a variety of architectures and designs.
- Owns the enterprise solution architecture roadmap for data products.
- Collaborates with the technical product lead on data governance requirements, including data ownership of data assets and data quality, lineage, and standards.
- Partners with business stakeholders to understand their data needs and desired functionality for the data product, and translates these requirements into clear data modelling specifications.
- Designs and implements high-quality data models that optimize data access, storage, and analysis.
- Creates comprehensive, well-maintained documentation of the data models, including entity relationship diagrams, data dictionaries, and usage guidelines.
- Collaborates with data engineers to test and validate the data models, and obtains sign-off from the DPL, DPA, and the Technical Product Lead on the logical and physical data models.
- Continuously monitors and optimizes the performance of the models to ensure efficient data retrieval and processing, and works with data engineers to translate logical models into physical data models throughout the development lifecycle.

To be successful as a Senior Engineering and Design Lead, you should have experience with:
- Cloud platform expertise (AWS): proficiency with AWS data-related services such as S3, Redshift, Athena, Glue, and Lambda.
- Cloud architecture design: designing cloud-native data architectures and optimizing cost and performance on AWS.
- Data modelling and architecture.
- Big data technologies such as Hadoop.
- Data warehousing and analytics platforms such as Teradata and Snowflake.
- SQL/scripting, data governance, and quality.
- Engaging with business stakeholders, tech teams, and data engineers to define requirements, align data strategies, and deliver high-value solutions, with proven experience leading cross-functional teams to execute complex data architectures.

Some other highly valued skills may include:
- Advanced cloud services familiarity, with exposure to multi-cloud or hybrid cloud architectures.
- Data orchestration and automation.
- Performance tuning and optimization.
- Data visualization.

You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is for the Chennai location.

Purpose of the role: To design, develop, and implement solutions to complex business problems, collaborating with stakeholders to understand their needs and requirements, and to design and implement solutions that meet those needs and create solutions that balance technology risks against business delivery, driving consistency.

Accountabilities:
- Design and development of solutions as products that can evolve, meeting business requirements that align with modern software engineering practices and automated delivery tooling. This includes identification and implementation of the technologies and platforms.
- Targeted design activities that apply an appropriate workload placement strategy and maximise the benefit of cloud capabilities such as elasticity, serverless, and containerisation.
- Best practice designs incorporating security principles (such as defence in depth and reduction of blast radius) that meet the Bank's resiliency expectations.
- Solutions that appropriately balance risks and controls to deliver the agreed business and technology value.
- Adoption of standardised solutions where they fit; if no standard solutions fit, feed into their ongoing evolution where appropriate.
- Fault-finding and performance-issue support to operational support teams, leveraging available tooling.
- Solution design impact assessment in terms of risk, capacity, and cost impact, including estimation of project change and ongoing run costs.
- Development of the requisite architecture inputs required to comply with the bank's governance processes, including design artefacts required for architecture, privacy, security, and records management governance processes.

Vice President expectations:
- Contribute to or set strategy, drive requirements, and make recommendations for change. Plan resources, budgets, and policies; manage and maintain policies/processes; deliver continuous improvements and escalate breaches of policies/procedures.
- If managing a team, define jobs and responsibilities, plan for the department's future needs and operations, counsel employees on performance, and contribute to employee pay decisions/changes. They may also lead a number of specialists to influence the operations of a department, in alignment with strategic as well as tactical priorities, while balancing short- and long-term goals and ensuring that budgets and schedules meet corporate requirements.
- If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: Listen and be authentic, Energise and inspire, Align across the enterprise, and Develop others.
- For an individual contributor, they will be a subject matter expert within their own discipline and will guide technical direction. They will lead collaborative, multi-year assignments, guide team members through structured assignments, and identify the need to include other areas of specialisation to complete assignments. They will train, guide, and coach less experienced specialists and provide information affecting long-term profits, organisational risks and strategic decisions.
- Advise key stakeholders, including functional leadership teams and senior management, on functional and cross-functional areas of impact and alignment.
- Manage and mitigate risks through assessment, in support of the control and governance agenda.
- Demonstrate leadership and accountability for managing risk and strengthening controls in relation to the work your team does.
- Demonstrate a comprehensive understanding of the organisation's functions to contribute to achieving the goals of the business.
- Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and business strategies.
- Create solutions based on sophisticated analytical thought, comparing and selecting complex alternatives. In-depth analysis with interpretative thinking will be required to define problems and develop innovative solutions.
- Adopt and include the outcomes of extensive research in problem-solving processes.
- Seek out, build, and maintain trusting relationships and partnerships with internal and external stakeholders in order to accomplish key business objectives, using influencing and negotiating skills to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset to Empower, Challenge and Drive, the operating manual for how we behave.
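As a generic illustration of turning a logical model into a physical one on AWS (not Barclays' actual models or process), here is a hedged sketch that issues star-schema DDL through the Redshift Data API; the cluster, database, and table names are placeholders:

```python
# Generic star-schema sketch: a fact table keyed to a conformed dimension,
# issued through the Redshift Data API. All names are hypothetical.
import boto3

rsd = boto3.client("redshift-data")

DDL = """
CREATE TABLE IF NOT EXISTS dim_customer (
    customer_key BIGINT IDENTITY(1,1) PRIMARY KEY,
    customer_id  VARCHAR(32) NOT NULL,
    segment      VARCHAR(64)
) DISTSTYLE ALL;

CREATE TABLE IF NOT EXISTS fact_transactions (
    customer_key BIGINT REFERENCES dim_customer(customer_key),
    txn_date     DATE SORTKEY,
    amount       DECIMAL(18,2)
) DISTKEY(customer_key)
"""

# Submit each statement separately; the Data API runs them asynchronously.
for stmt in filter(None, (s.strip() for s in DDL.split(";"))):
    rsd.execute_statement(
        ClusterIdentifier="example-cluster",
        Database="analytics",
        DbUser="architect",
        Sql=stmt,
    )
```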

Posted 5 days ago

Apply

3.0 - 7.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Working with Us: Challenging. Meaningful. Life-changing. Those aren't words that are usually associated with a job. But working at Bristol Myers Squibb is anything but usual. Here, uniquely interesting work happens every day, in every department. From optimizing a production line to the latest breakthroughs in cell therapy, this is work that transforms the lives of patients, and the careers of those who do it. You'll get the chance to grow and thrive through opportunities uncommon in scale and scope, alongside high-achieving teams. Take your career farther than you thought possible.

Bristol Myers Squibb recognizes the importance of balance and flexibility in our work environment. We offer a wide variety of competitive benefits, services and programs that provide our employees with the resources to pursue their goals, both at work and in their personal lives. Read more: careers.bms.com/working-with-us.

Summary: As a Data Engineer based out of our BMS Hyderabad office, you are part of the Data Platform team, supporting the larger Data Engineering community that delivers data and analytics capabilities across different IT functional domains. The ideal candidate will have a strong background in data engineering, DataOps, and cloud-native services, and will be comfortable working with both structured and unstructured data.

Key Responsibilities:
- Design, build, and maintain ETL pipelines and data products, drive the evolution of the data products, and utilize the most suitable data architecture for our organization's data needs.
- Deliver high-quality data products and analytics-ready data solutions; work with an end-to-end ownership mindset, innovating and driving initiatives through completion.
- Develop and maintain data models to support our reporting and analysis needs.
- Optimize data storage and retrieval to ensure efficient performance and scalability.
- Collaborate with data architects, data analysts, and data scientists to understand their data needs and ensure that the data infrastructure supports their requirements.
- Ensure data quality and integrity through data validation and testing.
- Implement and maintain security protocols to protect sensitive data.
- Stay up to date with emerging trends and technologies in data engineering and analytics.
- Closely partner with the Enterprise Data and Analytics Platform team, other functional data teams, and the Data Community lead to shape and adopt data and technology strategy.
- Serve as the Subject Matter Expert on Data & Analytics Solutions, staying knowledgeable about evolving trends in data platforms and product-based implementation.
- Mentor other team members effectively to unlock full potential; be comfortable working in a fast-paced environment with minimal oversight. Prior experience working in an Agile/product-based environment is expected.

Qualifications & Experience:
- 7+ years of hands-on experience implementing and operating data capabilities and cutting-edge data solutions, preferably in a cloud environment.
- Breadth of experience in technology capabilities spanning the full life cycle of data management, including data lakehouses, master/reference data management, data quality, and analytics/AI-ML.
- In-depth knowledge and hands-on experience with AWS Glue services and the AWS data engineering ecosystem.
- Hands-on experience developing and delivering data and ETL solutions with technologies like AWS data services (Redshift, Athena, Lake Formation, etc.); Cloudera Data Platform and Tableau experience is a plus.
- 5+ years of experience in data engineering or software development.
- Ability to create and maintain optimal data pipeline architecture and assemble large, complex data sets that meet functional and non-functional business requirements.
- Ability to identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Strong programming skills in languages such as Python, R, PyTorch, PySpark, Pandas, Scala, etc.
- Experience with SQL and database technologies such as MySQL, PostgreSQL, Presto, etc.
- Experience with cloud-based data technologies such as AWS, Azure, or Google Cloud Platform.
- Strong analytical and problem-solving skills; excellent communication and collaboration skills.
- Functional knowledge or prior experience in the Life Sciences Research and Development domain is a plus.
- Experience and expertise in establishing agile, product-oriented teams that work effectively with teams in the US and other global BMS sites.
- Initiates challenging opportunities that build strong capabilities for self and team; demonstrates a focus on improving processes, structures, and knowledge within the team; leads in analyzing current states, delivers strong recommendations, and executes to bring complex solutions to completion.

If you come across a role that intrigues you but doesn't perfectly line up with your resume, we encourage you to apply anyway. You could be one step away from work that will transform your life and career.

Uniquely Interesting Work, Life-changing Careers: With a single vision as inspiring as "Transforming patients' lives through science", every BMS employee plays an integral role in work that goes far beyond ordinary. Each of us is empowered to apply our individual talents and unique perspectives in a supportive culture, promoting global participation in clinical trials, while our shared values of passion, innovation, urgency, accountability, inclusion and integrity bring out the highest potential of each of our colleagues.

On-site Protocol: BMS has an occupancy structure that determines where an employee is required to conduct their work. This structure includes site-essential, site-by-design, field-based and remote-by-design jobs. The occupancy type that you are assigned is determined by the nature and responsibilities of your role. Site-essential roles require 100% of shifts onsite at your assigned facility. Site-by-design roles may be eligible for a hybrid work model with at least 50% onsite at your assigned facility; for these roles, onsite presence is considered an essential job function and is critical to collaboration, innovation, productivity, and a positive Company culture. For field-based and remote-by-design roles, the ability to physically travel to visit customers, patients or business partners and to attend meetings on behalf of BMS as directed is an essential job function.

BMS is dedicated to ensuring that people with disabilities can excel through a transparent recruitment process, reasonable workplace accommodations/adjustments and ongoing support in their roles. Applicants can request a reasonable workplace accommodation/adjustment prior to accepting a job offer. If you require reasonable accommodations/adjustments in completing this application, or in any part of the recruitment process, direct your inquiries to adastaffingsupport@bms.com. Visit careers.bms.com/eeo-accessibility to access our complete Equal Employment Opportunity statement.

BMS cares about your well-being and the well-being of our staff, customers, patients, and communities. As a result, the Company strongly recommends that all employees be fully vaccinated for Covid-19 and keep up to date with Covid-19 boosters. BMS will consider for employment qualified applicants with arrest and conviction records, pursuant to applicable laws in your area. If you live in or expect to work from Los Angeles County if hired for this position, please visit this page for important additional information: https://careers.bms.com/california-residents/. Any data processed in connection with role applications will be treated in accordance with applicable data privacy policies and regulations.
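For a flavour of the AWS analytics stack this posting names (Redshift, Athena, Lake Formation), here is a small hedged sketch that submits an Athena query via boto3; the database, table, and output location are placeholders:

```python
# Hypothetical Athena query step; names and locations are placeholders.
import boto3

athena = boto3.client("athena")

resp = athena.start_query_execution(
    QueryString=(
        "SELECT study_id, COUNT(*) AS n "
        "FROM curated.trial_events GROUP BY study_id"
    ),
    QueryExecutionContext={"Database": "curated"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
print(resp["QueryExecutionId"])  # poll get_query_execution for completion
```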

Posted 5 days ago

Apply

5.0 - 9.0 years

5 - 7 Lacs

Bengaluru

Work from Office

Planning and achieving sales targets in the retail market. Travelling across the entire state to develop new markets. Developing a business strategy with positioning. Planning and achieving primary and secondary sales targets. Evaluating and recommending new distributors.

Posted 5 days ago

Apply

2.0 - 7.0 years

15 - 20 Lacs

Hyderabad

Work from Office

Job Area: Engineering Group > Software Engineering

General Summary: As a leading technology innovator, Qualcomm pushes the boundaries of what's possible to enable next-generation experiences and drives digital transformation to help create a smarter, connected future for all. As a Qualcomm Software Engineer, you will design, develop, create, modify, and validate embedded and cloud edge software, applications, and/or specialized utility programs that launch cutting-edge, world-class products that meet and exceed customer needs. Qualcomm Software Engineers collaborate with systems, hardware, architecture, test engineers, and other teams to design system-level software solutions and obtain information on performance requirements and interfaces.

Minimum Qualifications:
- Bachelor's degree in Engineering, Information Systems, Computer Science, or a related field and 2+ years of Software Engineering or related work experience; OR Master's degree in a related field and 1+ year of Software Engineering or related work experience; OR PhD in Engineering, Information Systems, Computer Science, or a related field.
- 2+ years of academic or work experience with a programming language such as C, C++, Java, Python, etc.

Preferred Qualifications:
- 3+ years of experience as a Data Engineer or in a similar role.
- Experience with data modeling, data warehousing, and building ETL pipelines.
- Solid working experience with Python and AWS analytical technologies and related resources (Glue, Athena, QuickSight, SageMaker, etc.).
- Experience with Big Data tools, platforms, and architecture, with solid working experience with SQL.
- Experience working in a very large data warehousing environment and with distributed systems.
- Solid understanding of various data exchange formats and their complexities.
- Industry experience in software development, data engineering, business intelligence, data science, or a related field, with a track record of manipulating, processing, and extracting value from large datasets.
- Strong data visualization skills.
- Basic understanding of Machine Learning; prior experience in ML Engineering is a plus.
- Ability to manage on-premises data and make it interoperate with AWS-based pipelines.
- Ability to interface with Wireless Systems/SW engineers and understand the Wireless ML domain; prior experience in the Wireless (5G) domain is a plus.

Education: Bachelor's degree in computer science, engineering, mathematics, or a related technical discipline. Preferred: Master's in CS/ECE with a Data Science / ML specialization.

Minimum Qualifications:
- Bachelor's degree in Engineering, Information Systems, Computer Science, or a related field and 3+ years of Software Engineering or related work experience; OR Master's degree in a related field; OR PhD in a related field.
- 3+ years of experience with a programming language such as C, C++, Java, Python, etc.

Develops, creates, and modifies general computer applications software or specialized utility programs. Analyzes user needs and develops software solutions. Designs software or customizes software for client use with the aim of optimizing operational efficiency. May analyze and design databases within an application area, working individually or coordinating database development as part of a team. Modifies existing software to correct errors, allow it to adapt to new hardware, or improve its performance. Analyzes user needs and software requirements to determine feasibility of design within time and cost constraints. Confers with systems analysts, engineers, programmers and others to design systems and obtain information on project limitations and capabilities, performance requirements, and interfaces. Stores, retrieves, and manipulates data for analysis of system capabilities and requirements. Designs, develops, and modifies software systems, using scientific analysis and mathematical models to predict and measure outcomes and consequences of design.

Principal Duties and Responsibilities:
- Completes assigned coding tasks to specifications on time without significant errors or bugs.
- Adapts to changes and setbacks in order to manage pressure and meet deadlines.
- Collaborates with others inside the project team to accomplish project objectives.
- Communicates with the project lead to provide status and information about impending obstacles.
- Quickly resolves complex software issues and bugs.
- Gathers, integrates, and interprets information specific to a module or sub-block of code from a variety of sources in order to troubleshoot issues and find solutions.
- Seeks others' opinions and shares own opinions about ways in which a problem can be addressed differently.
- Participates in technical conversations with tech leads/managers.
- Anticipates and communicates issues with the project team to maintain open communication.
- Makes decisions based on incomplete or changing specifications and obtains adequate resources needed to complete assigned tasks.
- Prioritizes project deadlines and deliverables with minimal supervision.
- Resolves straightforward technical issues and escalates more complex technical issues to an appropriate party (e.g., project lead, colleagues).
- Writes readable code for large features or significant bug fixes to support collaboration with other engineers.
- Determines which work tasks are most important for self and junior engineers, stays focused, and deals with setbacks in a timely manner.
- Unit tests own code to verify the stability and functionality of a feature.

Applicants: Qualcomm is an equal opportunity employer. If you are an individual with a disability and need an accommodation during the application/hiring process, rest assured that Qualcomm is committed to providing an accessible process. You may e-mail disability-accomodations@qualcomm.com or call Qualcomm's toll-free number found here. Upon request, Qualcomm will provide reasonable accommodations to support individuals with disabilities to be able to participate in the hiring process. Qualcomm is also committed to making our workplace accessible for individuals with disabilities. (Keep in mind that this email address is used to provide reasonable accommodations for individuals with disabilities. We will not respond here to requests for updates on applications or resume inquiries.)

Qualcomm expects its employees to abide by all applicable policies and procedures, including but not limited to security and other requirements regarding protection of Company confidential information and other confidential and/or proprietary information, to the extent those requirements are permissible under applicable law.

To all Staffing and Recruiting Agencies: Please do not forward resumes to our jobs alias, Qualcomm employees or any other company location. Qualcomm is not responsible for any fees related to unsolicited resumes/applications. If you would like more information about this role, please contact Qualcomm Careers.

Posted 5 days ago

Apply

5.0 - 8.0 years

15 - 25 Lacs

Bengaluru

Work from Office

Job Title: AWS Data Engineer
Location: Bangalore
Notice Period: Immediate to 60 days preferred.

Job Description: We are seeking skilled and dynamic Cloud Data Engineers specializing in AWS and Databricks. The ideal candidate will have a strong background in data engineering, with a focus on data ingestion, transformation, and warehousing. They should also possess excellent knowledge of PySpark or Spark, and a proven ability to optimize performance in Spark job executions.

Key Responsibilities:
- Design, build, and maintain scalable data pipelines for cloud platforms including AWS.
- Implement data ingestion and transformation processes to facilitate efficient data warehousing.
- Utilize cloud services to enhance data processing capabilities (AWS: Glue, Athena, Lambda, Redshift, Step Functions, DynamoDB, SNS).
- Optimize Spark job performance to ensure high efficiency and reliability.
- Stay proactive in learning and implementing new technologies to improve data processing frameworks.
- Collaborate with cross-functional teams to deliver robust data solutions.
- Work on Spark Streaming for real-time data processing as necessary.

Qualifications:
- 5-8 years of experience in data engineering with a strong focus on cloud environments.
- Proficiency in PySpark or Spark (mandatory).
- Proven experience with data ingestion, transformation, and data warehousing.
- In-depth knowledge and hands-on experience with AWS cloud services.
- Demonstrated ability in performance optimization of Spark jobs.
- Strong problem-solving skills and the ability to work independently as well as in a team.
- Cloud certification (AWS) is a plus.
- Familiarity with Spark Streaming is a bonus.
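Since Spark performance optimization is central here, this hedged sketch shows two of the standard tuning idioms the posting alludes to: broadcasting a small dimension table and right-sizing partitions before a wide aggregation. Paths and column names are hypothetical:

```python
# Illustrative Spark tuning sketch; all paths and columns are placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("tuned-join").getOrCreate()

facts = spark.read.parquet("s3://example-bucket/facts/")
dims = spark.read.parquet("s3://example-bucket/dims/")  # small lookup table

# Broadcast the small side to avoid a shuffle-heavy sort-merge join.
joined = facts.join(broadcast(dims), "dim_key")

# Control partition count before an expensive groupBy shuffle.
result = (
    joined.repartition(200, "dim_key")
          .groupBy("dim_key")
          .agg(F.sum("amount").alias("total_amount"))
)

result.write.mode("overwrite").parquet("s3://example-bucket/out/")
```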

Posted 5 days ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Amazon Web Services (AWS)
Good to have skills: Python (Programming Language)
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will work on developing innovative solutions to enhance user experience and streamline processes.
- Strong experience in developing and supporting AWS applications.
- Hands-on experience in Python, GitOps, GitLab, and AWS services such as API Gateway, CloudWatch, Lambda, S3, SNS, SQS, Glue, Redshift, etc.
- Hands-on experience with IaC using Terraform.
- Expert-level scripting experience in Python.
- Experience deploying and maintaining CI/CD pipelines (Jenkins).
- Experience working as part of a development team and supporting operational requirements.
- Adherence to coding standards, procedures and techniques, and contribution to the technical code base including any required documentation.
- Experience working in a fast-paced Agile environment.
- Experience working closely with on-site resources on work intake and work delivery.
- Strong communication and collaboration skills.
- AWS certifications are a plus.

Responsibilities:
- Provide end-to-end application development, testing and support of AWS and on-prem applications.
- Write clean and well-designed reusable code.
- Execute changes in the CI/CD pipeline and deploy changes to different environments.
- Ensure availability, performance and resiliency of applications.
- Closely work with on-site resources, understand the work intake, provide status updates and deliver the work.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Amazon Web Services (AWS).
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
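To illustrate the Lambda/SQS/S3 combination listed above, here is a minimal hedged sketch of a Python Lambda handler that drains an SQS batch into S3; the bucket and key layout are hypothetical:

```python
# Hypothetical SQS-triggered Lambda; bucket and key names are placeholders.
import json

import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    for record in event["Records"]:  # one entry per SQS message in the batch
        body = json.loads(record["body"])
        key = f"events/{record['messageId']}.json"
        s3.put_object(
            Bucket="example-app-bucket",
            Key=key,
            Body=json.dumps(body).encode("utf-8"),
        )
    return {"processed": len(event["Records"])}
```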

Posted 5 days ago

Apply

15.0 - 20.0 years

45 - 55 Lacs

Pune, Bengaluru

Work from Office

Job Description: Senior Solution Architect – Data & Cloud

Job Title: Senior Solution Architect – Data & Cloud
Experience: 12+ Years
Location: Hybrid / Remote
Practice: Migration Works
Employment Type: Full-time

About Company: We are a data and analytics firm that provides the strategies, tools, capability and capacity that businesses need to turn their data into a competitive advantage. USEReady partners with cloud and data ecosystem leaders like Tableau, Salesforce, Snowflake, Starburst and Amazon Web Services, and has been named Tableau partner of the year multiple times. Headquartered in NYC, the company has 450 employees across offices in the U.S., Canada, India and Singapore and specializes in financial services. USEReady's deep analytics expertise, unique player/coach approach and focus on fast results makes the company a perfect partner for a cloud-first, digital world.

About the Role: We are looking for a highly experienced Senior Solution Architect to join our Migration Works practice, specializing in modern data platforms and visualization tools. The ideal candidate will bring deep technical expertise in Tableau, Power BI, AWS, and Snowflake, along with strong client-facing skills and the ability to design scalable, high-impact data solutions. You will be at the forefront of driving our AI-driven migration and modernization initiatives, working closely with customers to understand their business needs and guiding delivery teams to success.

Key Responsibilities:
- Solution Design & Architecture: Lead the end-to-end design of cloud-native data architecture using AWS, Snowflake, and the Azure stack. Translate complex business requirements into scalable and efficient technical solutions. Architect modernization strategies from legacy BI systems to cloud-native platforms.
- Client Engagement: Conduct technical discussions with enterprise clients and stakeholders to assess needs and define a roadmap. Act as a trusted advisor during pre-sales and delivery phases, showcasing technical leadership and a consultative approach.
- Migration & Modernization: Design frameworks for data platform migration (from on-premise to cloud), data warehousing, and analytics transformation. Support estimation, planning, and scoping of migration projects.
- Team Leadership & Delivery Oversight: Guide and mentor delivery teams across geographies, ensuring solution quality and alignment to client goals. Support delivery by providing architectural oversight and resolving design bottlenecks. Conduct technical reviews, define best practices, and uplift the team's capabilities.

Required Skills & Experience:
- 15+ years of progressive experience in data and analytics, with at least 5 years in solution architecture roles.
- Strong hands-on expertise in: Tableau and Power BI (dashboard design, visualization architecture, and migration from legacy BI tools); AWS (S3, Redshift, Glue, Lambda, and data pipeline components); Snowflake (architecture, SnowConvert, data modeling, security, and performance optimization).
- Experience migrating legacy platforms (e.g., Cognos, BO, Qlik) to modern BI/cloud-native stacks like Tableau and Power BI.
- Proven ability to interface with senior client stakeholders, understand business problems, and propose architectural solutions.
- Strong leadership, communication, and mentoring skills.
- Familiarity with data governance, security, and compliance in cloud environments.

Preferred Qualifications:
- AWS/Snowflake certifications are a strong plus.
- Exposure to data catalogs, lineage tools, and metadata management.
- Knowledge of ETL/ELT tools such as Talend, Informatica, or dbt.
- Prior experience working in consulting or fast-paced client services environments.

What We Offer:
- Opportunity to work on cutting-edge AI-led cloud and data migration projects.
- A collaborative and high-growth environment with room to shape future strategy.
- Access to learning programs, certifications, and technical leadership exposure.

Posted 5 days ago

Apply

8.0 - 10.0 years

15 - 20 Lacs

Bengaluru

Work from Office

Roles and Responsibilities: You must be an experienced Azure cloud engineer with 8-10 years of experience, joining the AI engineering team to develop, implement, optimize, and maintain cloud-based solutions. You will be responsible for deploying and debugging cloud stacks, adopting and implementing best cloud practices, and ensuring the security of the cloud infrastructure.

Responsibilities:
- As part of a project scrum team, design and implement the most optimal cloud-based solutions for the company.
- Ensure application performance, uptime, and scale, maintaining high standards for code quality and thoughtful design.
- Modify and improve existing systems.
- Educate teams on the implementation of new cloud technologies and initiatives.
- Ensure efficient functioning of data storage and processing functions in accordance with company security policies and best practices in cloud security.
- Identify, analyze, and resolve infrastructure vulnerabilities and application deployment issues.
- Define and document best practices and strategies regarding application deployment and infrastructure maintenance.
- Regularly review existing systems and make recommendations for improvements.

Qualifications:
- Degree in computer science or a similar field.
- Eight or more years of experience architecting, designing, developing, and implementing cloud solutions in Azure or AWS.
- Understanding of core cloud concepts like infrastructure as code, IaaS, PaaS and SaaS.
- Strong proficiency in Python and experience with REST API development: design and implement scalable, secure, and efficient cloud-based solutions using Azure services, and develop and maintain RESTful APIs to support various applications.
- Technologies: Python, Terraform, Azure App Services, Functions, App Insights, and ADF in Azure; a similar technology stack for AWS (ECS, Lambda, S3, Glue jobs, etc.).
- Experience developing and maintaining continuous integration and continuous deployment pipelines (Jenkins, Groovy scripts).
- Experience developing containerized solutions and orchestration (Docker, Kubernetes, App Service or ECS).
- Experience with serverless architecture, cloud computing, cloud-native applications, and scalability.
- Collaborate with cross-functional teams to define, design, and ship new features; optimize applications for maximum speed and scalability.
- Implement robust security measures and ensure compliance with industry standards; monitor and troubleshoot application performance and resolve issues.
- Participate in code reviews and contribute to continuous improvement of the development process.
- Development experience with configuration management tools (Terraform, Ansible, ARM templates).
- Relevant Azure/AWS certification preferred.
- Troubleshooting and analytical skills.
- Knowledge of AI & ML technologies and ML model management is a plus.
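For the Azure Functions and REST API work mentioned above, here is a minimal hedged sketch using the Azure Functions Python v2 programming model; the route and payload are illustrative, not an actual service contract:

```python
# Minimal Azure Functions HTTP endpoint sketch; route and payload are
# hypothetical examples, not a real service definition.
import json

import azure.functions as func

app = func.FunctionApp(http_auth_level=func.AuthLevel.ANONYMOUS)

@app.route(route="health")
def health(req: func.HttpRequest) -> func.HttpResponse:
    # A real service would check downstream dependencies here.
    return func.HttpResponse(
        json.dumps({"status": "ok"}),
        mimetype="application/json",
        status_code=200,
    )
```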

Posted 5 days ago

Apply

8.0 - 12.0 years

15 - 25 Lacs

Gurugram, Delhi / NCR

Work from Office

Skills Requirements:
• Must have: Python, Pytest, SQL, ETL Automation, AWS, Data Warehousing
• Good to have: Java, Selenium, API Automation, Rest Assured, Postman
• Proficient in automation testing tools such as Selenium or Appium
• Knowledge of scripting languages such as Python or JavaScript
• Experience with test automation frameworks and best practices
• Familiarity with Agile testing methodologies
• Excellent communication skills, with the ability to explain complex technical information to non-technical stakeholders in a clear and concise manner
• Must understand the company's long-term vision and align with it
• Open to new ideas and willing to learn and develop new skills
• Able to work well under pressure and manage multiple tasks and priorities
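As a flavour of the pytest-based ETL checks this role involves, here is an illustrative sketch; the fixture stands in for real source and warehouse connections, which are assumptions here:

```python
# Illustrative ETL validation tests; the fixture data is hypothetical and
# would normally come from querying the source system and the warehouse.
import pytest

@pytest.fixture
def pipeline_result():
    source_rows = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.5}]
    target_rows = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.5}]
    return source_rows, target_rows

def test_row_counts_match(pipeline_result):
    # Reconciliation: nothing dropped or duplicated in flight.
    source, target = pipeline_result
    assert len(source) == len(target)

def test_no_duplicate_keys(pipeline_result):
    # Primary-key integrity on the loaded table.
    _, target = pipeline_result
    ids = [row["id"] for row in target]
    assert len(ids) == len(set(ids))
```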

Posted 6 days ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Hybrid

Company Description: Epsilon is an all-encompassing global marketing innovator, supporting 15 of the top 20 global brands. We provide unrivaled data intelligence and customer insights, world-class technology including loyalty, email and CRM platforms, and data-driven creative, activation and execution. Epsilon's digital media arm, Conversant, is a leader in personalized digital advertising and insights through its proprietary technology and trove of consumer marketing data, delivering digital marketing with unprecedented scale, accuracy and reach through personalized media programs and through CJ Affiliate by Conversant, one of the world's largest affiliate marketing networks. Together, we bring personalized marketing to consumers across offline and online channels, at moments of interest, that help drive business growth for brands. Recognized by Ad Age as the #1 World's Largest CRM/Direct Marketing Agency Network, #1 Largest U.S. Agency from All Disciplines, #1 Largest U.S. CRM/Direct Marketing Agency Network and #1 Largest U.S. Mobile Marketing Agency, Epsilon employs over 8,000 associates in 70 offices worldwide. Epsilon is part of Alliance Data, a Fortune 500 and Fortune 100 Best Places to Work For company. For more information, visit www.epsilon.com and follow us on Twitter @EpsilonMktg.

Job Description

About BU: The Product team forms the crux of our powerful platforms and connects millions of customers to the product magic. This team of innovative thinkers develops and builds products that help Epsilon be a market differentiator. They map the future and set new standards for our products, empowered with industry best practices, ML and AI capabilities. The team passionately delivers intelligent end-to-end solutions and plays a key role in Epsilon's success story.

Why we are looking for you: We are looking for a Senior Software Engineer to work on a groundbreaking multichannel SaaS digital marketing platform that uniquely identifies customer patterns, effectively interacts with customers across channels, and achieves a positive return on marketing investment (ROMI). The platform consolidates and integrates the features and functionality typically found in stand-alone services and channel-specific messaging platforms to give marketers a tightly integrated, easily orchestrated, insights-driven, cross-channel marketing capability. The primary role of the Senior Software Engineer is to envision and build internet-scale services on the cloud using Java and distributed technologies, with a 60-40 split between backend development in Java and frontend development in Angular.

What you will enjoy in this role: Our integrated suite of modular products is designed to help deliver personalized experiences and drive meaningful outcomes. Our tech stack caters to a fusion of data and technology, with SaaS offerings developed cloud-first. Here, a solid understanding of software security practices, including user authentication and authorization, and being data-savvy are key. You should also be able to leverage best practices in design patterns and design algorithms for software development with a focus on high quality and agility. A good understanding of Agile methodologies like Scrum is also expected.

What you will do:
- Be responsible for development and maintenance of applications using Java and distributed technologies.
- Collaborate with developers, product managers, business analysts and business users in conceptualizing, estimating and developing new software applications and enhancements.
- Assist in the development and documentation of software objectives, deliverables, and specifications in collaboration with internal users and departments.
- Collaborate with the QA team to define test cases and metrics, and resolve questions about test results.
- Assist in the design and implementation process for new products; research and create POCs for possible solutions.
- Develop components based on business and/or application requirements.
- Create unit tests in accordance with team policies and procedures.
- Advise and mentor team members in specialized technical areas, and fulfill administrative duties as defined by the support process.
- Create value-adds that contribute to cost optimization, scalability, reliability and secure solutions.

Qualifications:
- Bachelor's degree or equivalent in computer science.
- 6+ years of experience in Java/Angular/SQL/AWS/microservices.
- Preferred knowledge/experience in the following technologies: 2+ years with UI technologies like Angular 2 or above; 1+ year of experience in cloud computing (AWS, Azure, GCP, PCF or OCI).
- Experience with tools such as Eclipse, Maven, Gradle, DB tools, and Bitbucket/JIRA/Confluence.
- Can develop SOA services, with good knowledge of REST APIs and microservice architectures.
- Solid knowledge of web architectural and design patterns.
- Understands software security practices, including user authentication and authorization, data validation, and common DoS and SQL injection techniques.
- Familiar with profiling, code coverage, logging, common IDEs and other development tools.
- Familiar with Agile methodologies (Scrum), with strong communication skills (verbal and written).
- Ability to work within tight deadlines and effectively prioritize and execute tasks in a high-pressure environment.
- Demonstrated ability to interface with Business, Analytics and IT organizations.
- Ability to work effectively in a short-cycle, team-oriented environment, managing multiple priorities and tasks.
- Ability to identify non-obvious solutions to complex problems.

Posted 6 days ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Data and Solution Architect at our company, you will play a crucial role in requirements definition, analysis, and the design of logical and physical data models, whether dimensional, NoSQL, or graph. You will lead data discovery discussions with the business in Joint Application Design (JAD) sessions and translate business requirements into logical and physical data modeling solutions. It will be your responsibility to conduct data model reviews with project team members and capture technical metadata using data modeling tools.

Your expertise will be essential in ensuring that database designs efficiently support Business Intelligence (BI) and end-user requirements. You will collaborate closely with ETL/Data Engineering teams to create data process pipelines for data ingestion and transformation, and work with Data Architects on data model management, documentation, and version control. Staying updated on industry trends and standards will be crucial in driving continual improvement and enhancement of existing systems.

To excel in this role, you must possess strong data analysis and data profiling skills. Experience in conceptual, logical, and physical data modeling for Very Large Database (VLDB) data warehouses and graph databases will be highly valuable. Hands-on experience with modeling tools like ERWIN or other industry-standard tools is required, along with proficiency in both normalized and dimensional modeling disciplines and techniques. A minimum of 3 years' experience with Oracle Database and hands-on experience in Oracle SQL, PL/SQL, or Cypher are expected. Exposure to tools such as Databricks Spark, Delta technologies, and Informatica ETL will be beneficial. Good knowledge of or experience with AWS Redshift and graph database design and management is desired, as is working knowledge of AWS cloud technologies, particularly services like VPC, EC2, S3, DMS, and Glue.

You should hold a Bachelor's degree in Software Engineering, Computer Science, or Information Systems (or have equivalent experience). Excellent verbal and written communication skills are necessary, including the ability to describe complex technical concepts in relatable terms. The ability to manage and prioritize multiple workstreams confidently and make decisions about prioritization will be crucial, along with a data-driven mentality, self-motivation, responsibility, conscientiousness, and attention to detail.

In terms of education and experience, a Bachelor's degree in Computer Science, Engineering, or a relevant field is required, along with 3+ years of experience as a Data and Solution Architect supporting enterprise data and integration applications, or in a similar role for large-scale enterprise solutions. You should have at least 3 years of experience in big data infrastructure, with tuning experience across the lakehouse data ecosystem, including data lakes, data warehouses, and graph databases. AWS Solutions Architect Professional certification is advantageous, as is extensive experience in data analysis on critical enterprise systems like SAP, E1, mainframe ERP, SFDC, Adobe Platform, and eCommerce systems.

If you thrive in a dynamic environment and enjoy collaborating with enthusiastic individuals, this role is perfect for you. Join our team and be part of our journey towards innovation and excellence!

Posted 6 days ago

Apply

3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

We are seeking an experienced Databricks on AWS and PySpark Engineer to join our team. Your role will involve designing, building, and maintaining large-scale data pipelines and architectures using Databricks on AWS, and optimizing data processing workflows with PySpark. Collaborating with data scientists and analysts to develop data models, and ensuring data quality, security, and compliance with industry standards, will also be key responsibilities. Your main tasks will include troubleshooting data pipeline issues, optimizing performance, and staying current with industry trends and emerging data engineering technologies. You should have at least 3 years of experience in data engineering with a focus on Databricks on AWS and PySpark, strong expertise in PySpark and Databricks for data processing, modeling, and warehousing, and hands-on experience with AWS services like S3, Glue, and IAM. Proficiency in data engineering principles, data governance, and data security is essential, along with experience managing data processing workflows and data pipelines. Strong problem-solving skills, attention to detail, and effective communication and collaboration abilities are the key soft skills required, as is the capability to work in a fast-paced, dynamic environment while adapting to changing requirements and priorities.
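As a rough illustration of the day-to-day work, here is a minimal Databricks-style PySpark sketch; the paths and table names are invented for the example.

from pyspark.sql import SparkSession, functions as F

# On Databricks, this returns the cluster's pre-built session.
spark = SparkSession.builder.getOrCreate()

events = (
    spark.read.json("s3://example-bucket/landing/events/")   # hypothetical landing zone
         .filter(F.col("event_type").isNotNull())            # basic data-quality gate
         .withColumn("event_date", F.to_date("event_ts"))
)

# Persist as a partitioned Delta table for analysts and data scientists.
(events.write
       .format("delta")
       .mode("append")
       .partitionBy("event_date")
       .saveAsTable("analytics.events_clean"))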

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Kochi, Kerala

On-site

You should have experience in designing and building serverless data lake solutions using a layered component architecture, covering ingestion, storage, processing, security and governance, data cataloguing and search, and the consumption layer. Proficiency in AWS serverless technologies such as Lake Formation, Glue, Glue Python, Glue Workflows, Step Functions, S3, Redshift, QuickSight, Athena, AWS Lambda, and Kinesis is required, and hands-on experience with Glue is essential. You must be skilled in designing, building, orchestrating, and deploying multi-step data processing pipelines using Python and Java. Experience in managing source data access security, configuring authentication and authorization, and enforcing data policies and standards is also necessary, along with familiarity with AWS environment setup and configuration. A minimum of 6 years of relevant experience, with at least 3 years building solutions on AWS, is mandatory. The ability to work under pressure and a commitment to meeting customer expectations are essential traits for this role. If you meet these requirements and are ready to take on this challenging opportunity, please reach out to hr@Stanratech.com.
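To make the Glue requirement concrete, here is a minimal AWS Glue ETL job skeleton of the kind such pipelines are built from; the database, table, and bucket names are placeholders, not details from this posting.

import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.dynamicframe import DynamicFrame
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog (placeholder database/table).
source = glue_context.create_dynamic_frame.from_catalog(
    database="raw_zone", table_name="orders"
)

# Deduplicate on the business key via the underlying Spark DataFrame.
deduped = source.toDF().dropDuplicates(["order_id"])

# Write curated output back to S3 as Parquet.
glue_context.write_dynamic_frame.from_options(
    frame=DynamicFrame.fromDF(deduped, glue_context, "deduped"),
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/orders/"},
    format="parquet",
)
job.commit()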

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Platform developer at Barclays, you will play a crucial role in shaping the digital landscape and enhancing customer experiences. Leveraging cutting-edge technology, you will work alongside a team of engineers, business analysts, and stakeholders to deliver high-quality solutions that meet business requirements. Your responsibilities will include tackling complex technical challenges, building efficient data pipelines, and staying updated on the latest technologies to continuously enhance your skills.

To excel in this role, you should have hands-on coding experience in Python, along with a strong understanding of, and practical experience in, AWS development. Experience with tools such as Lambda, Glue, Step Functions, IAM roles, and various other AWS services will be essential, and your expertise in building data pipelines using Apache Spark and AWS services will be highly valued. Strong analytical skills, troubleshooting abilities, and a proactive approach to learning new technologies are key attributes for success. Furthermore, experience in designing and developing enterprise-level software solutions, knowledge of file formats such as JSON, Iceberg, and Avro, and familiarity with streaming services such as Kafka, MSK, Kinesis, and Glue Streaming will be advantageous. Effective communication and collaboration skills are essential for interacting with cross-functional teams and documenting best practices.

Your role will involve developing and delivering high-quality software solutions, collaborating with various stakeholders to define requirements, promoting a culture of code quality, and staying updated on industry trends. Adherence to secure coding practices, implementation of effective unit testing, and continuous improvement are integral parts of your responsibilities. As a Data Platform developer, you will be expected to lead and supervise a team, guide professional development, and ensure the delivery of work to a consistently high standard. Your impact will extend to related teams within the organization, and you will be responsible for managing risks, strengthening controls, and contributing to the achievement of organizational objectives. Ultimately, you will be part of a team that upholds Barclays' values of Respect, Integrity, Service, Excellence, and Stewardship, while embodying the Barclays Mindset of Empower, Challenge, and Drive in your daily interactions and work ethic.
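As one hedged illustration of the streaming side of this stack, here is a small AWS Lambda handler consuming a Kinesis stream in Python; the event-type routing and field names are invented for the example.

import base64
import json

def handler(event, context):
    """Process a batch of Kinesis records delivered to Lambda."""
    processed = 0
    for record in event["Records"]:
        # Kinesis payloads arrive base64-encoded inside the Lambda event.
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        if payload.get("event_type") == "trade":   # hypothetical routing rule
            processed += 1                          # enrichment/forwarding would go here
    return {"processed": processed}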

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

Join us as a Cloud Data Engineer at Barclays, where you'll spearhead the evolution of the digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize digital offerings, ensuring unparalleled customer experiences. You may be assessed on key critical skills relevant for success in the role, such as risk and control, change and transformation, business acumen, strategic thinking, and digital technology, as well as job-specific skill sets. To be successful as a Cloud Data Engineer, you should have:
- Experience with AWS Cloud technology for data processing and a good understanding of AWS architecture.
- Experience with compute services such as EC2, Lambda, Auto Scaling, and VPC.
- Experience with storage and container services such as ECS, S3, DynamoDB, and RDS.
- Experience with Management & Governance services such as KMS, IAM, CloudFormation, CloudWatch, and CloudTrail.
- Experience with analytics services such as Glue, Athena, Crawler, Lake Formation, and Redshift.
- Experience with solution delivery for data processing components in larger end-to-end projects.
Desirable skill sets/good to have:
- AWS Certified professional.
- Experience in data processing on Databricks and Unity Catalog.
- Ability to drive projects technically with right-first-time deliveries within schedule and budget.
- Ability to collaborate across teams to deliver complex systems and components and manage stakeholders' expectations well.
- Understanding of different project methodologies, project lifecycles, major phases, dependencies and milestones within a project, and the required documentation needs.
- Experience with planning, estimating, organizing, and working on multiple projects.
This role will be based out of Pune.
Purpose of the role: To build and maintain systems that collect, store, process, and analyze data, such as data pipelines, data warehouses, and data lakes, ensuring that all data is accurate, accessible, and secure.
Accountabilities:
- Build and maintain data architecture pipelines that enable the transfer and processing of durable, complete, and consistent data.
- Design and implement data warehouses and data lakes that manage appropriate data volumes and velocity and adhere to required security measures.
- Develop processing and analysis algorithms fit for the intended data complexity and volumes.
- Collaborate with data scientists to build and deploy machine learning models.
Analyst Expectations:
- Will have an impact on the work of related teams within the area.
- Partner with other functions and business areas.
- Take responsibility for the end results of a team's operational processing and activities.
- Escalate breaches of policies/procedures appropriately.
- Take responsibility for embedding new policies/procedures adopted due to risk mitigation.
- Advise and influence decision making within own area of expertise.
- Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to.
- Deliver your work and areas of responsibility in line with relevant rules, regulations, and codes of conduct.
- Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organization's products, services, and processes within the function.
- Demonstrate an understanding of how areas coordinate and contribute to the achievement of the objectives of the organization's sub-function.
- Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents.
- Guide and persuade team members and communicate complex/sensitive information.
- Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organization.
All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset to Empower, Challenge, and Drive, the operating manual for how we behave.
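For a flavour of the analytics services listed above, here is a short, illustrative boto3 snippet that runs an Athena query and polls for the result; the database, query, and output location are placeholders.

import time
import boto3

athena = boto3.client("athena")

# Kick off the query; results land in the given S3 location.
qid = athena.start_query_execution(
    QueryString="SELECT event_date, COUNT(*) AS n FROM events GROUP BY event_date",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
)["QueryExecutionId"]

# Poll until the query reaches a terminal state.
while True:
    state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
    print(f"Fetched {len(rows)} result rows")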

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Surat, Gujarat

On-site

At devx, we specialize in helping some of India's most innovative brands unlock growth through AI-powered and cloud-native solutions built in collaboration with AWS. As a rapidly growing consultancy, we are dedicated to solving real-world business challenges with cutting-edge technology. We are currently seeking a proactive and customer-focused AWS Solutions Architect to join our team. In this position, you will work directly with clients to craft scalable, secure, and cost-effective cloud architectures that tackle significant business obstacles. Your role will bridge business requirements and technical implementation, establishing you as a trusted advisor to our clients.
Key Responsibilities:
- Engage with clients to understand their business goals and translate them into cloud-based architectural solutions.
- Design, deploy, and document AWS architectures, emphasizing scalability, security, and performance.
- Develop solution blueprints and work closely with engineering teams to ensure successful execution.
- Conduct workshops, presentations, and in-depth technical discussions with client teams.
- Stay informed about the latest AWS offerings and best practices, integrating them into solution designs.
- Collaborate with sales, product, and engineering teams to provide comprehensive solutions.
We are looking for individuals with the following qualifications:
- A minimum of 2 years of experience designing and implementing solutions on AWS.
- Proficiency in fundamental AWS services such as EC2, S3, Lambda, RDS, API Gateway, IAM, and VPC.
- Proficiency in fundamental AI/ML and data services such as Bedrock, SageMaker, Glue, Athena, and Kinesis.
- Proficiency in fundamental DevOps services such as ECS, EKS, CI/CD pipelines, Fargate, and Lambda.
- Excellent written and verbal communication skills in English.
- Comfort in client-facing capacities, with the ability to lead technical dialogues and establish credibility with stakeholders.
- The ability to balance technical detail with business context, effectively communicating value to decision-makers.
Location: Surat, Gujarat
Note: This is an on-site role in Surat, Gujarat. Please apply only if you are willing to relocate.
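As a hedged sketch of the AI/ML services named above, here is a minimal boto3 call to Amazon Bedrock; the model ID and request shape follow the Anthropic-on-Bedrock convention and should be verified against the models enabled in a given account.

import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Invoke a hosted model (illustrative model ID and prompt).
response = bedrock.invoke_model(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 256,
        "messages": [{"role": "user", "content": "Summarise this week's support tickets."}],
    }),
)

# The response body is a stream; read and decode it.
print(json.loads(response["body"].read()))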

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Maharashtra

On-site

As an AWS DataOps Lead at Birlasoft, you will be responsible for configuring, deploying, monitoring, and managing AWS data platforms. Your role will involve managing data flows and dispositions in S3, Snowflake, and Postgres, and you will be in charge of user access and authentication on AWS, ensuring proper resource provisioning, security, and compliance. Your experience with GitHub integration will be valuable in this role, and familiarity with AWS native tools like Glue, Glue Catalog, CloudWatch, and CloudFormation (or Terraform) is essential. You will also play a key part in assisting with backup and disaster recovery processes. Join our team and be a part of Birlasoft's commitment to leveraging Cloud, AI, and digital technologies to empower societies worldwide and enhance business efficiency and productivity. With over 12,000 professionals and a rich heritage spanning 170 years, we are dedicated to building sustainable communities and driving innovation through our consultative and design-thinking approach.
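To ground the tooling list, here is a small, illustrative boto3 sketch of two routine DataOps tasks, listing Glue Catalog tables and setting a CloudWatch alarm on Glue job failures; all names and the metric choice are assumptions.

import boto3

glue = boto3.client("glue")
cloudwatch = boto3.client("cloudwatch")

# Inventory the tables registered in a Glue Catalog database (placeholder name).
tables = glue.get_tables(DatabaseName="curated_zone")["TableList"]
print([t["Name"] for t in tables])

# Alarm when a Glue job records any failed tasks (hypothetical job name).
cloudwatch.put_metric_alarm(
    AlarmName="glue-nightly-load-failures",
    Namespace="Glue",
    MetricName="glue.driver.aggregate.numFailedTasks",
    Dimensions=[{"Name": "JobName", "Value": "nightly_load"}],
    Statistic="Sum",
    Period=300,
    EvaluationPeriods=1,
    Threshold=0,
    ComparisonOperator="GreaterThanThreshold",
)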

Posted 1 week ago

Apply

2.0 - 6.0 years

8 - 18 Lacs

Gurugram

Remote

Role Characteristics: The analytics team provides analytical support to multiple stakeholders (Product, Engineering, Business Development, Ad Operations) by developing scalable analytical solutions, identifying problems, defining KPIs, and monitoring them to measure the impact and success of product improvements and to streamline processes. This is an exciting and challenging role that will enable you to work with large data sets, expose you to cutting-edge analytical techniques, let you work with the latest AWS analytics infrastructure (Redshift, S3, Athena), and give you experience in using location data to drive businesses. Working in a dynamic start-up environment will give you significant opportunities for growth within the organization. A successful applicant will be passionate about technology and about developing a deep understanding of human behavior in the real world. They will also have excellent communication skills, be able to synthesize and present complex information, and be a fast learner.
You will:
Perform root cause analysis with minimal guidance to determine the reasons for sudden changes or abnormalities in metrics
Understand the objective/business context of various tasks and seek clarity by collaborating with different stakeholders (such as Product and Engineering)
Derive insights and put them together to build a story that solves a given problem
Suggest process improvements in terms of script optimization and automating repetitive tasks
Create and automate reports and dashboards through Python to track metrics based on given requirements
Technical Skills (Must have):
B.Tech degree in Computer Science, Statistics, Mathematics, Economics, or related fields
4-6 years of experience working with data and conducting statistical and/or numerical analysis
Ability to write SQL code
Scripting/automation using Python
Hands-on experience with a data visualisation tool like Looker/Tableau/QuickSight
Basic to advanced understanding of statistics
Other Skills (Must have):
Willingness and ability to quickly learn about new businesses, database technologies, and analysis techniques
Strong oral and written communication
Ability to recognize patterns and trends and draw insights from them
Preferred Qualifications (Nice to have):
Experience working with large datasets
Experience with AWS analytics infrastructure (Redshift, S3, Athena, Boto3)
Hands-on experience with AWS services like Lambda, Step Functions, Glue, and EMR, plus exposure to PySpark
What we offer: At GroundTruth, we want our employees to be comfortable with their benefits so they can focus on doing the work they love.
Parental leave (maternity and paternity)
Flexible time off (earned leaves, sick leaves, birthday leave, bereavement leave, and company holidays)
In-office daily catered lunch
Fully stocked snacks/beverages
Health cover for any hospitalization, covering both nuclear family and parents
Tele-med for free doctor consultation, plus discounts on health checkups and medicines
Wellness/gym reimbursement
Pet expense reimbursement
Childcare expenses and reimbursements
Employee assistance program
Employee referral program
Education reimbursement program
Skill development program
Cell phone reimbursement (mobile subsidy program)
Internet reimbursement
Birthday treat reimbursement
Employee Provident Fund Scheme offering different tax-saving options such as VPF and employee and employer contributions up to 12% of basic
Creche reimbursement
Co-working space reimbursement
NPS employer match
Meal card for tax benefit
Special benefits on salary account
We are an equal opportunity employer and value diversity, inclusion, and equity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
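As a minimal sketch of the Python report automation this role describes, the snippet below pulls a daily KPI extract from S3 and flags large day-over-day swings worth a root-cause look; the bucket, key, and column names are invented.

import boto3
import pandas as pd

s3 = boto3.client("s3")
obj = s3.get_object(Bucket="example-bucket", Key="reports/daily_kpis.csv")
kpis = pd.read_csv(obj["Body"])  # the S3 StreamingBody is file-like, so pandas can read it

# Flag metrics that moved more than 20% day over day.
kpis["dod_change"] = kpis["active_users"].pct_change()
anomalies = kpis[kpis["dod_change"].abs() > 0.2]
print(anomalies[["date", "active_users", "dod_change"]])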

Posted 1 week ago

Apply

4.0 - 9.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Job Summary: We are looking for a talented Data Engineer cum Database Developer with a strong background in the banking sector. The ideal candidate will have experience with SQL Server, AWS PostgreSQL, AWS Glue, and ETL tools, along with expertise in data ingestion frameworks and Control-M scheduling.
Key Responsibilities:
Design, develop, and maintain scalable data pipelines to support data ingestion and transformation processes.
Collaborate with cross-functional teams to gather requirements and implement solutions tailored to banking applications.
Utilize SQL Server and AWS PostgreSQL for database development, optimization, and management.
Implement data ingestion frameworks to ensure efficient and reliable data flow.
Develop and maintain ETL processes using AWS Glue or other ETL tools, with Control-M for scheduling.
Ensure data quality and integrity through validation and testing processes.
Monitor and optimize system performance to support business analytics and reporting needs.
Document data architecture, processes, and workflows for reference and compliance purposes.
Stay updated on industry trends and best practices related to data engineering and management.
Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field.
4+ years of experience in data engineering and database development, preferably in the banking sector.
Proficiency in SQL Server and AWS PostgreSQL.
Experience with Databricks/AWS Glue or other ETL tools (e.g., Informatica, ADF).
Strong understanding of data ingestion frameworks and methodologies.
Excellent problem-solving skills and attention to detail.
Knowledge of securitization in the banking industry would be a plus.
Strong communication skills for effective collaboration with stakeholders.
Familiarity with cloud-based data architectures and services.
Experience with data warehousing concepts and practices.
Knowledge of data privacy and security regulations in banking.
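For illustration, here is a hedged sketch of a Glue job landing curated data into AWS PostgreSQL over JDBC, the kind of pipeline this posting describes; the catalog names and the Glue connection are placeholders that would be defined in the AWS console.

from awsglue.context import GlueContext
from pyspark.context import SparkContext

glue_context = GlueContext(SparkContext.getOrCreate())

# Read a curated banking dataset from the Glue Data Catalog (placeholder names).
loans = glue_context.create_dynamic_frame.from_catalog(
    database="banking_curated", table_name="loan_positions"
)

# Write to PostgreSQL through a pre-defined Glue JDBC connection.
glue_context.write_dynamic_frame.from_jdbc_conf(
    frame=loans,
    catalog_connection="postgres-connection",  # hypothetical Glue connection name
    connection_options={"dbtable": "public.loan_positions", "database": "risk"},
)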

Posted 1 week ago

Apply