2.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About the Organization
Impetus Technologies is a digital engineering company focused on delivering expert services and products to help enterprises achieve their transformation goals. We solve the analytics, AI, and cloud puzzle, enabling businesses to drive unmatched innovation and growth. Founded in 1991, we are cloud and data engineering leaders providing solutions to Fortune 100 enterprises. Headquartered in Los Gatos, California, we have development centers in Noida, Indore, Gurugram, Bengaluru, Pune, and Hyderabad with over 3,000 global team members. We also have offices in Canada and Australia and collaborate with a number of established companies, including American Express, Bank of America, Capital One, Toyota, United Airlines, and Verizon.

Locations: Noida / Indore / Pune / Gurugram / Bengaluru

Job Overview
We are seeking talented and experienced professionals in PySpark, Big Data technologies, and cloud solutions to join our team. The position spans three levels: Engineer, Module Lead, and Lead Engineer. In these roles, you will be responsible for developing, optimizing, and managing ETL pipelines in cloud and on-premises environments using Big Data tools and AWS services. You will collaborate with cross-functional teams, ensuring that business requirements are met through efficient and scalable data solutions.

Key Responsibilities
- ETL Pipeline Development: Design and develop efficient ETL pipelines per business requirements while adhering to development standards and best practices.
- AWS Integration & Testing: Perform integration testing in AWS environments and ensure seamless data operations across platforms.
- Estimation & Planning: Provide estimates for development, testing, and deployments across various environments.
- Peer Reviews & Best Practices: Participate in code peer reviews, ensuring code quality, adherence to best practices, and continuous improvement within the team.
- Cost-Effective Solutions: Build and maintain cost-effective pipelines using AWS services such as S3, IAM, Glue, EMR, and Redshift.
- Cloud Migrations: Support and manage migrations from on-premises to cloud or between cloud environments.
- Orchestration & Scheduling: Manage job orchestration with tools like Airflow or other relevant job schedulers.

Required Skills & Qualifications

Experience:
- Engineer: 2-5 years of experience with PySpark, Hadoop, Hive, and related Big Data technologies.
- Module Lead: 4-6 years of experience with PySpark, Hadoop, Hive, and related Big Data technologies.
- Lead Engineer: 5-7 years of experience with PySpark, Hadoop, Hive, and related Big Data technologies.

Technical Skills:
- Hands-on experience with PySpark (DataFrame and SparkSQL), Hadoop, and Hive.
- Proficiency in Python and Bash scripting.
- Solid understanding of SQL and data warehouse concepts.
- Experience with AWS Big Data services (IAM, Glue, EMR, Redshift, S3, Kinesis) is a plus.
- Experience with orchestration tools (e.g., Airflow, job schedulers) is beneficial.

Analytical & Problem-Solving Skills:
- Strong analytical, problem-solving, and data analysis skills.
- Ability to think creatively and implement innovative solutions beyond readily available tools.

Communication & Interpersonal Skills:
- Excellent communication, presentation, and interpersonal skills to collaborate effectively with internal and external teams.

Desired Skills & Experience
- Experience migrating workloads between on-premises systems and cloud environments.
- Familiarity with cloud-native technologies and platforms.
- Knowledge of performance optimization techniques for distributed data processing.

For a quick response, interested candidates can share their resume along with Notice Period, Current CTC, and Expected CTC at anubhav.pathania@impetus.com
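The ETL pipeline development this role centers on follows the classic extract-transform-load pattern. A minimal sketch of that pattern in plain Python, with lists of dictionaries standing in for PySpark DataFrames and with invented field names and conversion rate:

```python
# Minimal sketch of the extract-transform-load pattern described above.
# Plain Python stands in for PySpark DataFrames; names are illustrative only.

def extract(rows):
    """Extract: in a real pipeline this would read from S3, Hive, etc."""
    return list(rows)

def transform(rows):
    """Transform: drop invalid records and derive a new column."""
    cleaned = []
    for row in rows:
        if row.get("amount") is None:
            continue  # drop records failing a basic quality rule
        cleaned.append({**row, "amount_usd": round(row["amount"] * 0.012, 2)})
    return cleaned

def load(rows, target):
    """Load: append to the target store (a list here; Redshift/S3 in practice)."""
    target.extend(rows)
    return len(rows)

source = [{"id": 1, "amount": 100.0}, {"id": 2, "amount": None}]
warehouse = []
loaded = load(transform(extract(source)), warehouse)
print(loaded)  # 1 valid record loaded
```

In PySpark the same stages would typically be a read, a chain of DataFrame filters and `withColumn` transforms, and a write to the target store.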
Posted 2 weeks ago
1.0 - 3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Description
- 1-3 years of good hands-on exposure to Big Data technologies: PySpark (DataFrame and SparkSQL), Hadoop, and Hive
- Good hands-on experience with Python and Bash scripts
- Good understanding of SQL and data warehouse concepts
- Strong analytical, problem-solving, data analysis, and research skills
- Demonstrable ability to think outside the box without depending on readily available tools
- Excellent communication, presentation, and interpersonal skills are a must

Good to have:
- Hands-on experience with cloud-platform Big Data services (e.g., IAM, Glue, EMR, Redshift, S3, Kinesis)
- Orchestration with Airflow and experience with any job scheduler
- Experience migrating workloads from on-premises to cloud and between clouds

Roles and Responsibilities
- Develop efficient ETL pipelines per business requirements, following development standards and best practices.
- Perform integration testing of created pipelines in AWS environments.
- Provide estimates for development, testing, and deployments across environments.
- Participate in code peer reviews to ensure our applications comply with best practices.
- Create cost-effective AWS pipelines with the required AWS services, e.g., S3, IAM, Glue, EMR, and Redshift.

For a quick response, interested candidates can share their resume along with Notice Period, Current CTC, and Expected CTC at anubhav.pathania@impetus.com
Posted 2 weeks ago
0 years
0 Lacs
India
Remote
Motion Designer - Product Videos (Cinematic Product Storytelling Specialist – Remote, India)

🚀 About ScaleKraft
ScaleKraft is a fast-growing explainer video studio helping top-tier SaaS, AI, health tech, and fintech startups tell world-class product stories. We don’t make generic explainer videos. We craft cinematic UI launch trailers — sharp, sleek, emotionally engaging videos that make apps feel premium, powerful, and built for scale. Our work sits at the intersection of product design, storytelling, and motion. We're building a new creative category — and we’re looking for someone to help lead that movement.

🔥 The Role: Cinematic UI Animator & Motion Specialist
We’re hiring a UI-focused Motion Designer to help us turn real product designs into beautiful, fluid, and high-converting cinematic launch videos for SaaS and AI companies. You’ll work closely with the founder and creative team, taking real UI from Figma and animating it with precision in After Effects. You must understand usability, motion logic, and product flow — and be obsessed with the details. This is a full-time, remote position (India-based, ideally near Hyderabad). You’ll need to be self-directed, async-ready, and excited about building something category-defining.
🎯 Responsibilities

✦ UI Animation & Motion Execution
- Take real Figma files and turn them into premium-quality motion videos
- Use AEUX or other plug-ins to move layered designs into After Effects
- Animate transitions, flows, and micro-interactions with clear logic and cinematic timing

✦ Cinematic Storytelling & Visual Polish
- Collaborate on narrative structure, pacing, and feature sequencing
- Execute camera movements, easing, blur, and polish that elevate the UX
- Add sound cues or rhythm (if capable), or coordinate with the audio designer

✦ Workflow & Creative Systems
- Optimize and document the Figma → AE → Final export workflow
- Recommend and experiment with tools like Flow, EaseCopy, Ray Dynamic Color, Overlord, and Lottie
- Help define and evolve the ScaleKraft motion language

✦ (Optional) Light 3D Integration
- Add depth via simple 3D mockups or animated scenes (using Blender or C4D)
- Collaborate with 3D artists when needed

🧠 Skills We’re Looking For

✅ Must-Haves
- Advanced After Effects skills (camera, easing, expressions, compositing)
- Strong understanding of UI/UX animation logic and interface behavior
- Experience importing and animating from Figma (AEUX, Lottie, manual prep)
- Portfolio showing UI motion for SaaS, product, or tech brands
- Detail-obsessed — you notice a 2px misalignment or off easing instantly
- Ability to self-manage, hit deadlines, and work async (Slack, Notion, Loom)

✨ Nice-to-Have Bonus Skills
- Familiarity with Blender, C4D, Octane, Redshift, or Cycles
- Experience using plug-ins like Flow, EaseCopy, Ray Dynamic Color, Overlord
- Sound design instincts or music-timed motion
- Premiere Pro or DaVinci Resolve for final cuts
- Real-time engines (Unreal/Unity) are a plus

🧬 Culture Fit: What We Look For
We play long-term games with long-term people.
We're building a creative culture that rewards:
- Ownership and speed, not micromanagement
- Polish, taste, and visual discipline
- Curiosity about product, design systems, and motion theory
- People who care deeply about their craft

📦 Compensation & Perks
- Competitive salary based on skills and speed
- Bonus incentives for innovation and polish
- Flexible, async-friendly remote work
- Long-term growth path as we scale toward $2M ARR
- Opportunity to build a name in a new creative category

🧪 Hiring Process
1. Portfolio Review
2. Initial Interview (to align vision, style, and expectations)
3. Paid Test Project (15–30 second UI motion video using a Figma file)
4. Trial Month (fully compensated)
5. Full-Time Offer

🎥 To Apply
Please send:
- A link to your motion portfolio (required)
- A short note on your favorite UI animation project
- Tools and plug-ins you use (Figma → AE, and any 3D if applicable)

🚫 Applications without a portfolio will not be considered

🌟 Why Join ScaleKraft?
- Help define a new category in product-focused motion design
- Work with cutting-edge SaaS, AI, and tech companies
- Be part of a fast-moving, design-first studio
- Make launch videos that actually move the needle — not fluff
Posted 2 weeks ago
4.0 - 5.0 years
0 Lacs
Greater Kolkata Area
On-site
Role: Data Integration Specialist
Experience: 4 - 5 Years
Location: India
Employment Type: Full-time

About The Role
We are looking for a highly skilled and motivated Data Integration Specialist with 4 to 5 years of hands-on experience to join our growing team in India. In this role, you will be responsible for designing, developing, implementing, and maintaining robust data pipelines and integration solutions that connect disparate systems and enable seamless data flow across the enterprise. You'll play a crucial part in ensuring data availability, quality, and consistency for various analytical and operational needs.

Key Responsibilities
- ETL/ELT Development: Design, develop, and optimize ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes using industry-standard tools and technologies.
- Data Pipeline Construction: Build and maintain scalable and efficient data pipelines from various source systems (databases, APIs, flat files, streaming data, cloud sources) to target data warehouses, data lakes, or analytical platforms.
- Tool Proficiency: Hands-on experience with at least one major ETL tool such as Talend, Informatica PowerCenter, SSIS, Apache NiFi, IBM DataStage, or similar platforms.
- Database Expertise: Proficient in writing and optimizing complex SQL queries across various relational databases (e.g., SQL Server, Oracle, PostgreSQL, MySQL) and NoSQL databases.
- Cloud Data Services: Experience with cloud-based data integration services on platforms like AWS (Glue, Lambda, S3, Redshift), Azure (Data Factory, Synapse Analytics), or GCP (Dataflow, BigQuery) is highly desirable.
- Scripting: Develop and maintain scripts (e.g., Python, shell scripting) for automation, data manipulation, and orchestration of data processes.
- Data Modeling: Understand and apply data modeling concepts (e.g., dimensional modeling, Kimball/Inmon methodologies) for data warehousing solutions.
- Data Quality & Governance: Implement data quality checks and validation rules, and participate in establishing data governance best practices to ensure data accuracy and reliability.
- Performance Tuning: Monitor, troubleshoot, and optimize data integration jobs and pipelines for performance, scalability, and reliability.
- Collaboration & Documentation: Work closely with data architects, data analysts, business intelligence developers, and business stakeholders to gather requirements, design solutions, and deliver data assets. Create detailed technical documentation for data flows, mappings, and transformations.
- Problem Solving: Identify and resolve complex data-related issues, ensuring data integrity and consistency.

Qualifications
- Education: Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related quantitative field.
- Experience: 4 to 5 years of dedicated experience in data integration, ETL development, or data warehousing.
- Core Skills: Strong proficiency in SQL and at least one leading ETL tool (as listed above).
- Programming: Hands-on experience with Python or shell scripting for data manipulation and automation.
- Databases: Solid understanding of relational database concepts and experience with various database systems.
- Analytical Thinking: Excellent analytical, problem-solving, and debugging skills with attention to detail.
- Communication: Strong verbal and written communication skills to articulate technical concepts to both technical and non-technical audiences.
- Collaboration: Ability to work effectively in a team environment and collaborate with cross-functional teams.

Preferred/Bonus Skills
- Experience with real-time data integration or streaming technologies (e.g., Kafka, Kinesis).
- Knowledge of Big Data technologies (e.g., Hadoop, Spark).
- Familiarity with CI/CD pipelines for data integration projects.
- Exposure to data visualization tools (e.g., Tableau, Power BI).
Experience in specific industry domains (e.g., Finance, Healthcare, Retail) (ref:hirist.tech)
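The data quality checks and validation rules this role calls for are typically small, named predicates applied to each batch. A minimal sketch in plain Python, with invented rule names, columns, and thresholds:

```python
# Illustrative sketch of simple data quality checks as described above.
# Rule names, columns, and thresholds are invented for the example.

def check_not_null(rows, column):
    """Return rows where a required column is missing or None."""
    return [r for r in rows if r.get(column) is None]

def check_in_range(rows, column, lo, hi):
    """Return rows whose value falls outside an expected range."""
    return [r for r in rows if r.get(column) is not None
            and not (lo <= r[column] <= hi)]

def run_checks(rows):
    """Apply each rule and report how many rows fail it."""
    failures = {
        "customer_id_not_null": check_not_null(rows, "customer_id"),
        "age_in_range": check_in_range(rows, "age", 0, 120),
    }
    return {name: len(bad) for name, bad in failures.items()}

batch = [
    {"customer_id": "C1", "age": 34},
    {"customer_id": None, "age": 28},
    {"customer_id": "C3", "age": 250},
]
report = run_checks(batch)
print(report)  # {'customer_id_not_null': 1, 'age_in_range': 1}
```

In an ETL tool or cloud service the same rules would run as validation steps in the pipeline, with a batch passing only when every rule reports zero failing rows.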
Posted 2 weeks ago
6.0 - 7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: AWS Data Engineer
Location: Hyderabad (Hybrid)
Experience Required: 6 to 7 Years
Employment Type: Permanent

Job Description
We are looking for a highly skilled and experienced AWS Data Engineer to join our growing data engineering team. The ideal candidate will have a strong background in building scalable data pipelines, data integration, and hands-on experience with AWS cloud services. You will play a critical role in designing, developing, and optimizing data solutions to support analytics and business intelligence efforts across the organization.

Responsibilities
- Design, develop, and maintain scalable and robust data pipelines using AWS cloud-native tools.
- Build and manage ETL/ELT processes to ingest and transform structured and unstructured data.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver reliable datasets.
- Implement best practices for data engineering, including data quality, governance, testing, and security.
- Monitor data workflows and troubleshoot issues across the pipeline lifecycle.
- Work on performance tuning, scalability, and cost optimization of data processes on AWS.
- Create and maintain technical documentation related to data pipelines and infrastructure.
- Contribute to continuous integration and deployment (CI/CD) automation for data workflows.

Skills
- 6 to 7 years of overall experience in data engineering.
- Strong expertise in AWS data services such as AWS Glue, Redshift, S3, Lambda, Step Functions, and Athena.
- Proficiency in Python or Scala for ETL development and data transformation.
- Solid experience with SQL for data manipulation and querying.
- Experience with data lake and data warehouse architecture.
- Good understanding of data modeling concepts and performance tuning.
- Hands-on experience with version control tools like Git and CI/CD pipelines.
- Familiarity with tools like Airflow, DBT, or similar workflow orchestration frameworks is a plus.
- Excellent problem-solving, analytical, and communication skills.

Good to Have
- Experience with big data technologies like Spark, Kafka, or Hadoop.
- Exposure to DevOps practices and infrastructure-as-code tools like Terraform or CloudFormation.
- Knowledge of data security, GDPR, and compliance requirements.

Qualification
Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.

Why Join Us?
- Work with a dynamic and collaborative team focused on cutting-edge data solutions.
- Opportunity to contribute to high-impact projects in a cloud-first environment.
- Flexible hybrid working model with a long-term career growth path.
(ref:hirist.tech)
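Monitoring and troubleshooting pipeline steps, as listed in the responsibilities, often comes down to retrying transient failures (throttled APIs, brief network errors) with exponential backoff. A hedged sketch of that pattern in plain Python; the step and its failure behavior are invented:

```python
import time

# Hypothetical sketch of retrying a flaky pipeline step with exponential
# backoff, a common pattern when monitoring data workflows on AWS.

def run_with_retries(step, max_attempts=3, base_delay=0.01):
    """Run `step`, retrying transient failures with exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except RuntimeError:
            if attempt == max_attempts:
                raise  # retries exhausted: surface the failure
            time.sleep(base_delay * 2 ** (attempt - 1))

calls = {"n": 0}

def flaky_extract():
    """Invented step that fails twice (e.g., a throttled API), then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient error")
    return "42 rows"

result = run_with_retries(flaky_extract)
print(result)  # "42 rows", after two retried failures
```

Managed services such as Step Functions and Airflow offer built-in retry configuration for exactly this; the sketch just shows the underlying idea.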
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
Hyderabad, Telangana
On-site
Salesforce is looking for software developers who are eager to make a significant impact with their code for users, the company, and the industry. You will collaborate with a team of top-notch engineers to create innovative features that our customers will appreciate, adopt, and utilize, all while maintaining the stability and scalability of our trusted CRM platform. The software engineer role at Salesforce involves architecture, design, implementation, and testing to ensure the delivery of high-quality products.

Your responsibilities as a Lead Engineer will include:
- Developing new components in a rapidly growing technology market to enhance scalability and efficiency.
- Writing high-quality, production-ready code that can serve millions of users.
- Making design decisions based on performance, scalability, and future growth.
- Contributing to all stages of the software development life cycle, including design, implementation, code reviews, automation, and testing.
- Building efficient components and algorithms in a microservice multi-tenant SaaS cloud environment.
- Conducting code reviews, mentoring junior engineers, and offering technical guidance to the team.

Required Skills:
- Proficiency in multiple programming languages and platforms.
- Over 10 years of experience in software development.
- Profound understanding of object-oriented programming and various languages such as Java, Python, Scala, C#, Go, Node.js, and C++.
- Strong SQL skills and familiarity with relational and non-relational databases like Postgres, Trino, Redshift, and MongoDB.
- Experience in developing SaaS applications on public cloud infrastructures like AWS, Azure, and GCP.
- Knowledge of queues, locks, scheduling, event-driven architecture, and workload distribution, as well as relational and non-relational databases.
- Understanding of software development best practices and demonstrated leadership skills.
- Degree or equivalent relevant experience required.
Experience will be assessed based on core competencies relevant to the role.

Benefits & Perks
Salesforce offers a comprehensive benefits package, including well-being reimbursement, generous parental leave, adoption assistance, fertility benefits, and more. You will have access to world-class enablement and on-demand training through Trailhead.com, opportunities to engage with executive thought leaders, regular 1:1 coaching with leadership, and participation in volunteer activities through Salesforce's 1:1:1 model for community outreach. For further information, please visit https://www.salesforcebenefits.com/.
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a MySQL DBA Lead Engineer, you will be responsible for managing and optimizing databases in a cloud environment. With over 15 years of experience, you will lead the team in ensuring the smooth operation of databases such as MySQL, PostgreSQL, SQL Server, and AWS RDS Aurora. Your expertise in SQL, NoSQL, and various databases will be essential in maintaining high performance and reliability.

Your role will involve advanced database management tasks such as backup, recovery, and tuning. You will have hands-on experience with MySQL, PostgreSQL, and MariaDB, including installation, configuration, and fine-tuning, and you will be proficient in MySQL replication concepts and performance tuning, ensuring optimal database performance.

In the AWS cloud environment, you will demonstrate your expertise in managing MariaDB on EC2 and RDS, as well as PostgreSQL RDS and Aurora. Your skills with database services like RDS MySQL, Aurora MySQL, and Redshift will be crucial in configuring, installing, and managing databases for efficient operation.

Moreover, your experience in migration projects from on-premises to cloud environments will be valuable. You will be adept at managing SQL Server databases and ensuring their smooth operation across different life-cycle environments. Your exposure to project deliverables and timely delivery will play a key role in project success.

Overall, your extensive experience with database technologies, cloud services, and operating systems will be instrumental in ensuring the reliability and performance of databases in a cloud environment.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
You should have proficiency in SQL, Python, and/or other relevant programming languages. Experience with DBT and similar data transformation platforms is required, along with experience with Airflow or similar data orchestration tools. Familiarity with data warehouse solutions such as Snowflake or Redshift is essential.

You must demonstrate a proven ability to work autonomously and manage your workload effectively, as well as proven experience working with cross-functional teams. Familiarity with iPaaS solutions like Workato, Celigo, or MuleSoft is a plus, as is experience with enterprise business applications such as Salesforce, NetSuite, SuiteProjects Pro, and Jira. Knowledge of cloud platforms like AWS, GCP, and Azure and related services is advantageous.

The ideal candidate has 8+ years of overall experience, with 3-5 years specifically in data engineering or a related field.
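At their core, orchestration tools like Airflow run tasks in an order that respects their dependencies. A minimal sketch of that idea using the standard library's topological sorter; the task names are invented:

```python
from graphlib import TopologicalSorter

# Sketch of what an orchestrator like Airflow does at its core: execute
# tasks in dependency order. Task names are invented for the example.

# Each task maps to the set of tasks it depends on.
dag = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_join": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform_join"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)

# Every task appears after all of its dependencies.
assert order.index("transform_join") > order.index("extract_orders")
assert order[-1] == "load_warehouse"
```

Airflow adds scheduling, retries, and monitoring on top, but a DAG definition there expresses exactly this kind of dependency graph.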
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
Coimbatore, Tamil Nadu
On-site
You are a seasoned Confluent & Oracle EBS Cloud Engineer with over 10 years of experience, responsible for leading the design and implementation of scalable, cloud-native data solutions. Your role involves modernizing enterprise data infrastructure, driving real-time data streaming initiatives, and migrating legacy ERP systems to AWS-based platforms.

Key responsibilities:
- Architect and implement cloud-based data platforms using AWS services such as Redshift, Glue, DMS, and Data Lake solutions.
- Lead the migration of Oracle E-Business Suite or similar ERP systems to AWS while ensuring data integrity and performance.
- Design and drive the implementation of Confluent Kafka for real-time data streaming across enterprise systems.
- Define and enforce data architecture standards, governance policies, and best practices.
- Collaborate with engineering, data, and business teams to align architecture with strategic goals.
- Optimize data pipelines and storage for scalability, reliability, and cost-efficiency.

Requirements:
- 10+ years of experience in data architecture, cloud engineering, or enterprise systems design.
- Deep expertise in AWS services, including Redshift, Glue, DMS, and Data Lake architectures.
- Proven experience with Confluent Kafka for real-time data streaming and event-driven architectures.
- Hands-on experience migrating large-scale ERP systems (e.g., Oracle EBS) to cloud platforms.
- Strong understanding of data governance, security, and compliance in cloud environments.
- Proficiency in designing scalable, fault-tolerant data systems.
Preferred qualifications include experience with data modeling, metadata management, and lineage tracking; familiarity with infrastructure-as-code and CI/CD practices; and strong communication and leadership skills to guide cross-functional teams.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Data Platform Engineer Lead at Barclays, your role is crucial in building and maintaining systems that collect, store, process, and analyze data, including data pipelines, data warehouses, and data lakes. Your responsibility includes ensuring the accuracy, accessibility, and security of all data.

To excel in this role, you should have hands-on coding experience in Java or Python and a strong understanding of AWS development, encompassing services such as Lambda, Glue, Step Functions, and IAM roles. Proficiency in building efficient data pipelines using Apache Spark and AWS services is essential. You are expected to possess strong technical acumen, troubleshoot complex systems, and apply sound engineering principles to problem-solving. Continuous learning and staying updated with new technologies are key attributes for success in this role.

Design experience on diverse projects where you have led the technical development is advantageous, especially in the Big Data/Data Warehouse domain within financial services. Also valued are experience developing enterprise-level software solutions, knowledge of file formats such as JSON, Iceberg, and Avro, and familiarity with streaming services such as Kafka, MSK, and Kinesis. Effective communication, collaboration with cross-functional teams, documentation skills, and experience mentoring team members are also important aspects of this role.

Your accountabilities will include the construction and maintenance of data architecture pipelines, designing and implementing data warehouses and data lakes, developing processing and analysis algorithms, and collaborating with data scientists to deploy machine learning models. You will also be expected to contribute to strategy, drive requirements for change, manage resources and policies, deliver continuous improvements, and demonstrate leadership behaviors if in a leadership role.
Ultimately, as a Data Platform Engineer Lead at Barclays in Pune, you will play a pivotal role in ensuring data accuracy, accessibility, and security while leveraging your technical expertise and collaborative skills to drive innovation and excellence in data management.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
You are an experienced Data Engineer who will lead the end-to-end migration of the data analytics and reporting environment to Looker at Frequence. Your role will involve designing scalable data models, translating business logic into LookML, and empowering teams across the organization with self-service analytics and actionable insights. You will collaborate closely with stakeholders from data, engineering, and business teams to ensure a smooth transition to Looker and establish best practices for data modeling, governance, and dashboard development.

Your responsibilities will include:
- Leading the migration of existing BI tools, dashboards, and reporting infrastructure to Looker
- Designing, developing, and maintaining scalable LookML data models, dimensions, measures, and explores
- Creating intuitive, actionable, and visually compelling Looker dashboards and reports
- Collaborating with data engineers and analysts to ensure consistency across data sources
- Translating business requirements into technical specifications and LookML implementations
- Optimizing SQL queries and LookML models for performance and scalability
- Implementing and managing Looker's security settings, permissions, and user roles in alignment with data governance standards
- Troubleshooting issues and supporting end users in their Looker adoption
- Maintaining version control of LookML projects using Git
- Advocating for best practices in BI development, testing, and documentation

You should have:
- Proven experience with Looker and deep expertise in LookML syntax and functionality
- Hands-on experience building and maintaining LookML data models, explores, dimensions, and measures
- Strong SQL skills, including complex joins, aggregations, and performance tuning
- Experience working with semantic layers and data modeling for analytics
- Solid understanding of data analysis and visualization best practices
- Ability to create clear, concise, and impactful dashboards and visualizations
- Strong problem-solving skills and attention to detail in debugging Looker models and queries
- Familiarity with Looker's security features and data governance principles
- Experience using version control systems, preferably Git
- Excellent communication skills and the ability to work cross-functionally
- Familiarity with modern data warehousing platforms (e.g., Snowflake, BigQuery, Redshift)
- Experience migrating from legacy BI tools (e.g., Tableau, Power BI) to Looker
- Experience working in agile data teams and managing BI projects
- Familiarity with dbt or other data transformation frameworks

At Frequence, you will be part of a dynamic, diverse, innovative, and friendly work environment that values creativity and collaboration. The company embraces differences and believes they drive creativity and innovation. The team consists of individuals from varied backgrounds who are all trail-blazing team players, thinking big and aiming to make a significant impact.

Please note that third-party recruiting agencies will not be involved in this search.
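The SQL skills called out above, the joins and aggregations that sit behind BI dashboards, can be illustrated with a small self-contained example using SQLite; the schema, tables, and data are invented:

```python
import sqlite3

# Self-contained illustration of the join + aggregation queries that
# typically power BI dashboards. Schema and data are invented.

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER,
                         amount REAL);
    INSERT INTO customers VALUES (1, 'APAC'), (2, 'EMEA'), (3, 'APAC');
    INSERT INTO orders VALUES (1, 1, 100.0), (2, 1, 50.0),
                              (3, 2, 75.0), (4, 3, 25.0);
""")

# Revenue per region: join orders to customers, then aggregate.
rows = conn.execute("""
    SELECT c.region, SUM(o.amount) AS revenue, COUNT(*) AS n_orders
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
    GROUP BY c.region
    ORDER BY revenue DESC
""").fetchall()

print(rows)  # [('APAC', 175.0, 3), ('EMEA', 75.0, 1)]
conn.close()
```

In Looker, a query like this would typically be expressed once as LookML dimensions and measures over an explore, so every dashboard reuses the same definition of "revenue".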
Posted 2 weeks ago
5.0 - 7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Roles & Responsibilities

Job Description – Sr. Data Engineer
We are looking for a Senior Data Engineer who will be primarily responsible for designing, building, and maintaining ETL/ELT pipelines, and for integrating data from multiple sources or vendors to provide holistic insights from data. You are expected to build and manage data warehouse solutions, design data models, create ETL processes, and implement data quality mechanisms. You will perform the EDA (exploratory data analysis) required to troubleshoot data-related issues and assist in their resolution. You should have experience in client interaction and in mentoring juniors and providing the required guidance.

Required Technical Skills
- Extensive experience in Python, PySpark, and SQL.
- Strong experience in data warehousing, ETL, data modeling, building ETL pipelines, and the Snowflake database.
- Strong hands-on experience in Azure and its services.
- Proficiency in Databricks, Redshift, ADF, etc.
- Hands-on experience with cloud services such as Azure and AWS (S3, Glue, Lambda, CloudWatch, Athena).
- Sound knowledge of end-to-end data management, DataOps, quality, and data governance.
- Knowledge of SFDC and Waterfall/Agile methodology.
- Strong knowledge of the Pharma domain/life sciences commercial data operations.

Qualifications
- Bachelor's or Master's degree in Engineering/MCA or equivalent.
- 5-7 years of relevant industry experience as a Data Engineer.
- Experience working with Pharma syndicated data such as IQVIA, Veeva, Symphony; Claims, CRM, Sales, etc.
- High motivation, good work ethic, maturity, self-organization, and personal initiative.
- Ability to work collaboratively and provide support to the team.
- Excellent written and verbal communication skills.
- Strong analytical and problem-solving skills.
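The EDA step mentioned above, profiling a dataset to locate the source of a data issue, usually starts with per-column null rates and value ranges. A minimal sketch in plain Python; the sample records and column names are invented:

```python
# Sketch of a basic EDA pass for troubleshooting data issues: profile each
# column's null count and value range. Sample records are invented.

def profile(rows, columns):
    """Return per-column null counts and min/max for a list of dict records."""
    stats = {}
    for col in columns:
        values = [r.get(col) for r in rows]
        non_null = [v for v in values if v is not None]
        stats[col] = {
            "nulls": len(values) - len(non_null),
            "min": min(non_null) if non_null else None,
            "max": max(non_null) if non_null else None,
        }
    return stats

sales = [
    {"rx_count": 12, "territory": "North"},
    {"rx_count": None, "territory": "South"},
    {"rx_count": 7, "territory": None},
]
report = profile(sales, ["rx_count", "territory"])
print(report["rx_count"])  # {'nulls': 1, 'min': 7, 'max': 12}
```

A spike in a column's null count or an out-of-range extreme is often the first clue to an upstream ingestion or mapping problem.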
Posted 2 weeks ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We are a technology-led healthcare solutions provider, driven by our purpose to enable healthcare organizations to be future-ready. We offer accelerated, global growth opportunities for talent that's bold, industrious, and nimble. With Indegene, you gain a unique career experience that celebrates entrepreneurship and is guided by passion, innovation, collaboration, and empathy. To explore exciting opportunities at the convergence of healthcare and technology, check out www.careers.indegene.com.

Looking to jump-start your career? We understand how important the first few years of your career are, as they create the foundation of your entire professional journey. At Indegene, we promise you a differentiated career experience. You will not only work at the exciting intersection of healthcare and technology but also be mentored by some of the most brilliant minds in the industry. We are offering a global fast-track career where you can grow along with Indegene's high-speed growth. We are purpose-driven: we enable healthcare organizations to be future-ready, and our customer obsession is our driving force. We ensure that our customers achieve what they truly want. We are bold in our actions, nimble in our decision-making, and industrious in the way we work.

Must Have

You Will Be Responsible For (core activities to be performed):
- Conduct functional and/or integration testing of multiple products in web, desktop, mobile, and multi-platform application environments.
- Create test cases based on defined requirements, wireframes, and use-case diagrams.
- Coordinate frequent updates based on changes, and reuse test assets wherever applicable across the product life cycle.
- Execute test cases at different stages of the project life cycle.
- Be familiar with different types of testing: black box (viz., functionality, user interface, smoke, database, integration, and regression).
- Experience in data validation, ETL testing, and writing SQL scripts for validation on Redshift or Snowflake.
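The ETL-testing duties above often come down to reconciliation queries between source and target. A minimal sketch using Python's built-in sqlite3 as a stand-in for Redshift or Snowflake; the table and column names are invented for illustration:

```python
import sqlite3

# Row-count and aggregate reconciliation between a "source" and "target"
# table, the kind of SQL validation ETL testers write. sqlite3 stands in
# for Redshift/Snowflake here; the schema is a hypothetical example.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_sales (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_sales (order_id INTEGER, amount REAL);
    INSERT INTO src_sales VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt_sales VALUES (1, 10.0), (2, 20.0);  -- row 3 missing
""")

# Counts and sums should match after a correct load.
src_count, src_sum = conn.execute(
    "SELECT COUNT(*), SUM(amount) FROM src_sales").fetchone()
tgt_count, tgt_sum = conn.execute(
    "SELECT COUNT(*), SUM(amount) FROM tgt_sales").fetchone()

# Identify the specific rows that never arrived in the target.
missing = conn.execute("""
    SELECT s.order_id FROM src_sales s
    LEFT JOIN tgt_sales t ON s.order_id = t.order_id
    WHERE t.order_id IS NULL
""").fetchall()

print(src_count - tgt_count, missing)  # 1 row short; order_id 3 missing
```

The same count/sum/anti-join pattern works unchanged on Redshift or Snowflake, since it is plain ANSI SQL.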
- Log defects against functionality verification with appropriate detail, validate upon bug fix, and track each defect to closure.
- Derive test data for all identified scenarios.
- Provide timely status reports to the QA Lead and/or Manager.
- Verify that each release has been implemented successfully.
- Expertise in SQL queries on Redshift or Snowflake.
- Exposure to Qlik report testing.
- Conversant with exploratory testing.
- UAT support and guidance.

Works closely with the Test Lead to:
- Plan and review releases.
- Conduct testing as needed.
- Review supporting documentation.

- Proven ability to manage and prioritize multiple, diverse projects simultaneously.
- Must be flexible, independent, and self-motivated.
- Excellent verbal and written communication skills.

Good to have

EQUAL OPPORTUNITY

Indegene is proud to be an Equal Opportunity Employer and is committed to a culture of inclusion and diversity. We do not discriminate on the basis of race, religion, sex, colour, age, national origin, pregnancy, sexual orientation, physical ability, or any other characteristic. All employment decisions, from hiring to separation, are based on business requirements and the candidate's merit and qualifications. All qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, national origin, gender identity, sexual orientation, disability status, protected veteran status, or any other characteristic.

Locations - Bangalore, KA, IN
Posted 2 weeks ago
3.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Basic Functions / Job Responsibility

Industry Advisory & Thought Leadership
- Act as a domain expert on data & AI platforms (cloud, hybrid, on-prem), data lakes, warehouses, mesh, and lakehouses.
- Advise companies (from startups to small and medium enterprises) on modern data architecture, integration, governance, and tooling.
- Support sector-specific data strategies (healthcare, BFSI, manufacturing, etc.) through working groups and roundtables.

Research & Framework Development
- Lead development of whitepapers, policy notes, and best practices on data interoperability, privacy-preserving data systems, and AI-ready data infrastructure.
- Track global benchmarks on data infrastructure and translate insights into actionable frameworks for companies.

Stakeholder Engagement
- Collaborate with government bodies, regulators, and industry leaders to co-develop India's data economy vision.
- Engage with ecosystem players: cloud providers, analytics firms, data exchange platforms, startups, and academia.

Capacity Building
- Support NASSCOM's data & AI skilling and literacy efforts through content, training modules, and evangelism.
- Mentor industry partners and incubated startups on building scalable and ethical data architectures.

Knowledge, Skills, Qualifications, Experience
- 3 to 5 years of experience.
- Deep understanding of:
  - Modern data platforms (e.g., Snowflake, Databricks, Google BigQuery, AWS Redshift, Azure Synapse)
  - Data integration and pipeline orchestration (e.g., Airflow, dbt, Kafka, Fivetran)
  - Governance and compliance (e.g., data cataloging, lineage, access control, anonymization)
- Knowledge of using AI models to build products and solutions, and of choosing the right model for the proposed solution.
- Knowledge of AI tools (MCP, LangChain, LangGraph) and their integration techniques.
- Knowledge of AI products and copilots that can be used for better productivity.
- Experience advising or working with cross-functional stakeholders: technical teams, policy makers, and business leaders.
- Knowledge of data standards, India's data protection laws, and international practices (e.g., GDPR, Open Data).
- Ability to suggest development processes, and to fine-tune and evaluate AI/ML models and algorithms to fit client needs.
- Build proofs-of-concept (PoCs), prepare demos, and support pilot projects in Generative AI (e.g., chatbots, text/image generation, data extraction).
- Assist with data wrangling, feature engineering, model training, and fine-tuning.
- Stay up to date on AI trends, tools, and frameworks to suggest relevant solutions for clients.
- Engage in knowledge sharing through whitepapers, internal training sessions, or short published research findings to maintain thought leadership.
- Assist in training upcoming AI and GenAI developers during NASSCOM developer sessions.

Preferred Qualifications
- Experience working in consulting, think tanks, industry bodies, or tech product companies.
- Exposure to industry data challenges in sectors like BFSI, health, retail, or the public sector.
- Familiarity with AI/ML platforms and responsible AI frameworks is a strong plus.
- Graduate/postgraduate degree in Computer Science, Data Science, Engineering, or a related field.

Locations: Bangalore
Posted 2 weeks ago
3.0 years
0 Lacs
Bangalore Urban, Karnataka, India
On-site
Software Engineer - AWS Glue
Job Date: Jun 22, 2025
Job Requisition Id: 61707
Location: Bangalore, KA, IN; Hyderabad, IN; Indore, IN; Pune, MH, IN

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies.
Our purpose is anchored in a single truth: bringing real positive change in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future. We are looking to hire AWS Glue professionals in the following areas:
- 3 or more years' experience in AWS Glue + Redshift + Python.
- 3+ years of engineering experience, including ETL-type work with cloud databases.
- Data management / data structures: must be proficient in technical data management tasks, i.e., writing code to read, transform, and store data.
- Spark: experience launching Spark jobs in client mode and cluster mode; familiarity with Spark job property settings and their implications for performance.
- SCC/Git: must be experienced with source code control systems such as Git.
- ETL: experience developing ELT/ETL processes, including loading data from enterprise-sized RDBMS systems such as Oracle, DB2, MySQL, etc.
- Programming: must be able to code in Python, or be expert in at least one high-level language such as Java, C, or Scala. Must have experience using REST APIs.
- SQL: must be expert in manipulating database data using SQL; familiarity with views, functions, stored procedures, and exception handling.
- AWS: general knowledge of the AWS stack (EC2, S3, EBS, …).
- IT process compliance: SDLC experience and formalized change controls; working in DevOps teams based on Agile principles (e.g., Scrum); ITIL knowledge (especially incident, problem, and change management).
- Proficiency in PySpark for distributed computation.
- Familiarity with Postgres and ElasticSearch.

At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale.
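The SQL skills called out in this listing (views, exception handling) can be illustrated with a small stdlib sketch; the schema, view name, and data are invented for the example:

```python
import sqlite3

# Creating a view and handling a database error from driver code,
# illustrating the SQL skills listed above. Table and view names
# are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER PRIMARY KEY, region TEXT, amount REAL);
    INSERT INTO orders VALUES (1, 'APAC', 120.0), (2, 'EMEA', 80.0), (3, 'APAC', 50.0);
    CREATE VIEW region_totals AS
        SELECT region, SUM(amount) AS total
        FROM orders GROUP BY region;
""")

totals = dict(conn.execute("SELECT region, total FROM region_totals"))
print(totals)  # {'APAC': 170.0, 'EMEA': 80.0}

# Exception handling: a duplicate primary key raises IntegrityError,
# which the driver catches rather than crashing the load.
try:
    conn.execute("INSERT INTO orders VALUES (1, 'AMER', 10.0)")
except sqlite3.IntegrityError as exc:
    print("rejected:", exc)
```

The same view-plus-error-handling pattern carries over to server databases; only the driver module and exception types change.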
Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and an ethical corporate culture
Posted 2 weeks ago
3.0 years
0 Lacs
Bangalore Urban, Karnataka, India
On-site
Sr. Software Engineer - AWS+Python+Pyspark
Job Date: Jun 22, 2025
Job Requisition Id: 61668
Location: Bangalore, KA, IN
We are looking to hire AWS professionals in the following areas:

AWS Data Engineer
- Primary skillset: AWS services including Glue, PySpark, SQL, Databricks, Python.
- Secondary skillset: any ETL tool, GitHub, DevOps (CI/CD).
- Experience: 3-4 years.
- Degree in computer science, engineering, or a similar field.

Mandatory skill set: Python, PySpark, SQL, and AWS, with experience designing, developing, testing, and supporting data pipelines and applications.
- 3+ years of working experience in data integration and pipeline development.
- 3+ years of experience with AWS Cloud data integration using a mix of Apache Spark, Glue, Kafka, Kinesis, and Lambda in the S3, Redshift, RDS, and MongoDB/DynamoDB ecosystems. Databricks and Redshift experience is a major plus.
- 3+ years of experience using SQL in data warehouse projects/applications (Oracle & SQL Server).
- Strong real-life Python development experience, especially PySpark in an AWS Cloud environment.
- Strong SQL and NoSQL database skills: MySQL, Postgres, DynamoDB, Elasticsearch.
- Workflow management tools like Airflow.
- AWS cloud services: RDS, AWS Lambda, AWS Glue, AWS Athena, EMR (equivalent tools in the GCP stack will also suffice).

Good to have: Snowflake, Palantir Foundry.

At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale.
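Pipelines built on services like Kinesis and Lambda routinely wrap calls in retry logic for transient failures such as throttling. A minimal stdlib sketch of such a wrapper; the failing function and backoff numbers are illustrative assumptions:

```python
import time

# Retry with exponential backoff, a common robustness pattern around
# transient failures in pipeline steps (e.g., throttled AWS API calls).
# The flaky function below simulates two transient errors then success.
def with_retries(fn, attempts=4, base_delay=0.01):
    for attempt in range(attempts):
        try:
            return fn()
        except RuntimeError:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))  # 0.01s, 0.02s, 0.04s, ...

calls = {"n": 0}
def flaky_put_record():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("throttled")  # simulated transient error
    return "ok"

result = with_retries(flaky_put_record)
print(result, calls["n"])  # succeeds on the third attempt
```

Real AWS SDK clients provide configurable retry modes built in; a hand-rolled wrapper like this is mainly useful around non-SDK steps.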
Posted 2 weeks ago
5.0 - 8.0 years
5 - 7 Lacs
Cochin
Remote
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. The opportunity We are looking for a seasoned and strategic-thinking Senior AWS DataOps Engineer to join our growing global data team. In this role, you will take ownership of critical data workflows and work closely with cross-functional teams to support, optimize, and scale cloud-based data pipelines. You will bring leadership to data operations, contribute to architectural decisions, and help ensure the integrity, availability, and performance of our AWS data infrastructure. Your key responsibilities Lead the design, monitoring, and optimization of AWS-based data pipelines using services like AWS Glue, EMR, Lambda, and Amazon S3. Oversee and enhance complex ETL workflows involving IICS (Informatica Intelligent Cloud Services), Databricks, and native AWS tools. Collaborate with data engineering and analytics teams to streamline ingestion into Amazon Redshift and lead data validation strategies. Manage job orchestration using Apache Airflow, AWS Data Pipeline, or equivalent tools, ensuring SLA adherence. Guide SQL query optimization across Redshift and other AWS databases for analytics and operational use cases. Perform root cause analysis of critical failures, mentor junior staff on best practices, and implement preventive measures. Lead deployment activities through robust CI/CD pipelines, applying DevOps principles and automation. Own the creation and governance of SOPs, runbooks, and technical documentation for data operations. Partner with vendors, security, and infrastructure teams to ensure compliance, scalability, and cost-effective architecture. 
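Job orchestration of the kind described here (Airflow-style DAGs) boils down to dependency-ordered execution. A stdlib sketch of topological ordering over a toy task graph; the task names are invented for illustration:

```python
from graphlib import TopologicalSorter

# Dependency-ordered task execution, the core idea behind Airflow-style
# orchestration. Each key depends on the tasks in its value set; the
# task names here are hypothetical.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load_redshift": {"quality_check"},
    "publish_report": {"load_redshift"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)  # extract -> transform -> quality_check -> load_redshift -> publish_report
```

Airflow layers scheduling, retries, and SLA tracking on top of exactly this ordering; `graphlib` (Python 3.9+) gives just the ordering itself.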
Skills and attributes for success
- Expertise in AWS data services and the ability to lead architectural discussions.
- Analytical thinker with the ability to design and optimize end-to-end data workflows.
- Excellent debugging and incident resolution skills in large-scale data environments.
- Strong leadership and mentoring capabilities, with clear communication across business and technical teams.
- A growth mindset with a passion for building reliable, scalable data systems.
- Proven ability to manage priorities and navigate ambiguity in a fast-paced environment.

To qualify for the role, you must have
- 5-8 years of experience in DataOps, Data Engineering, or related roles.
- Strong hands-on expertise in Databricks.
- Deep understanding of ETL pipelines and modern data integration patterns.
- Proven experience with Amazon S3, EMR, Glue, Lambda, and Amazon Redshift in production environments.
- Experience with Airflow or AWS Data Pipeline for orchestration and scheduling.
- Advanced knowledge of IICS or similar ETL tools for data transformation and automation.
- SQL skills with an emphasis on performance tuning, complex joins, and window functions.

Technologies and Tools

Must haves
- Proficiency in Amazon S3, EMR (Elastic MapReduce), AWS Glue, and Lambda.
- Expertise in Databricks: the ability to develop, optimize, and troubleshoot advanced notebooks.
- Strong experience with Amazon Redshift for scalable data warehousing and analytics.
- Solid understanding of orchestration tools like Apache Airflow or AWS Data Pipeline.
- Hands-on experience with IICS (Informatica Intelligent Cloud Services) or comparable ETL platforms.

Good to have
- Exposure to Power BI or Tableau for data visualization.
- Familiarity with CDI, Informatica, or other enterprise-grade data integration platforms.
- Understanding of DevOps and CI/CD automation tools for data engineering workflows.
- SQL familiarity across large datasets and distributed databases.

What we look for
Enthusiastic learners with a passion for data operations and practices.
Problem solvers with a proactive approach to troubleshooting and optimization. Team players who can collaborate effectively in a remote or hybrid work environment. Detail-oriented professionals with strong documentation skills. What we offer EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career. Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 2 weeks ago
6.0 years
20 - 35 Lacs
India
On-site
Data Engineer (6–8 Years) | Hyderabad, India | SaaS Product | MongoDB | Finance Automation Resourcedekho is hiring for a leading client in the agentic AI-based finance automation space. We’re looking for a passionate and experienced Data Engineer to join a high-impact team in Hyderabad. Why Join Us? Open budget for the right talent—compensation based on your expertise and interview performance. Work with cutting-edge technologies in a high-growth, product-driven environment. Collaborate with top minds from reputed institutions (IIT/IIM or similar). What You’ll Do: Design, build, and optimize robust data pipelines for ingesting, processing, and transforming data from diverse sources. Implement and maintain ETL workflows using tools like Debezium, Kafka, Airflow, Jenkins . Develop and optimize SQL/NoSQL schemas, queries, and stored procedures for efficient data retrieval. Work with both relational (MySQL, PostgreSQL) and NoSQL (MongoDB, DocumentDB) databases. Design and implement scalable data warehouse solutions for analytics and ML applications. Collaborate with data scientists and ML engineers to prepare data for AI/ML models. Ensure data quality, monitoring, and alerting for accuracy and reliability. Optimize query performance through indexing, partitioning, and query refactoring. Maintain comprehensive documentation for data models, pipelines, and processes. Stay updated with the latest in data engineering tech and best practices. What We’re Looking For: 6+ years of experience in data engineering or related roles. Strong proficiency in SQL and experience with MySQL, PostgreSQL . Hands-on expertise with MongoDB (or AWS DocumentDB)— mandatory . Proven experience designing and optimizing ETL processes (Kafka, Debezium, Airflow, etc.). Solid understanding of data modeling, warehousing, and performance optimization. Experience with AWS data services (RDS, Redshift, S3, Glue, Kinesis, ELK stack). 
Proficiency in at least one programming language ( Python, Node.js, Java ). Experience with Git and CI/CD pipelines. Bachelor’s degree in Computer Science, Engineering, or related field. SaaS product experience is a must. Preference for candidates from reputed colleges (IIT/IIM or similar) and with stable career history. Bonus Points For: Experience with graph databases (Neo4j, Amazon Neptune). Knowledge of big data tech (Hadoop, Spark, Hive, data lakes). Real-time/streaming data processing. Familiarity with data governance, security, Docker, Kubernetes. FinTech or financial back-office domain experience. Startup/high-growth environment exposure. Ready to take your data engineering career to the next level? Apply now or reach out to us at career@resourceDekho.com to learn more! Please note: Only candidates with relevant SaaS product experience and strong MongoDB skills will be considered. Job Type: Full-time Pay: ₹2,000,000.00 - ₹3,500,000.00 per year Application Deadline: 22/07/2025 Expected Start Date: 18/08/2025
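The Debezium/Kafka CDC work this listing describes centers on applying change events to a target store. A stdlib sketch of that apply step; the event shape loosely mirrors Debezium's op codes, but the payload fields are simplified assumptions, not Debezium's exact envelope:

```python
# Applying CDC-style change events to an in-memory "target table",
# the core of a Debezium/Kafka ingestion path. The event structure
# ('op' codes c/u/d, 'key', 'row') is a simplified assumption.
def apply_events(target, events):
    for event in events:
        op, key = event["op"], event["key"]
        if op in ("c", "u"):          # create or update: upsert the row
            target[key] = event["row"]
        elif op == "d":               # delete: drop the row if present
            target.pop(key, None)
    return target

events = [
    {"op": "c", "key": 1, "row": {"name": "invoice-1", "status": "open"}},
    {"op": "c", "key": 2, "row": {"name": "invoice-2", "status": "open"}},
    {"op": "u", "key": 1, "row": {"name": "invoice-1", "status": "paid"}},
    {"op": "d", "key": 2},
]
table = apply_events({}, events)
print(table)  # {1: {'name': 'invoice-1', 'status': 'paid'}}
```

In production this apply step runs against MongoDB or a warehouse table, with ordering guaranteed per key by Kafka partitioning.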
Posted 2 weeks ago
15.0 years
2 - 6 Lacs
Hyderābād
On-site
DESCRIPTION

At AWS, we are looking for a Delivery Practice Manager with a successful record of leading enterprise customers through a variety of transformative projects involving IT strategy, distributed architecture, and hybrid cloud operations. AWS Global Services includes experts from across AWS who help our customers design, build, operate, and secure their cloud environments. Customers innovate with AWS Professional Services, upskill with AWS Training and Certification, optimize with AWS Support and Managed Services, and meet objectives with AWS Security Assurance Services. Our expertise and emerging technologies include AWS Partners, AWS Sovereign Cloud, AWS International Product, and the Generative AI Innovation Center. You'll join a diverse team of technical experts in dozens of countries who help customers achieve more with the AWS cloud. Professional Services engage in a wide variety of projects for customers and partners, provide collective experience from across the AWS customer base, and are obsessed with strong success for the customer. Our team collaborates across the entire AWS organization to bring access to product and service teams, to get the right solution delivered, and to drive feature innovation based upon customer needs.

10034

Key job responsibilities
- Engage customers - collaborate with enterprise sales managers to develop strong customer and partner relationships and build a growing business in a geographic territory, driving AWS adoption in key markets and accounts.
- Drive infrastructure engagements - including short on-site projects proving the value of AWS services to support new distributed computing models.
- Coach and teach - collaborate with AWS field sales, pre-sales, training, and support teams to help partners and customers learn and use AWS services such as Amazon databases (RDS/Aurora/DynamoDB/Redshift), Amazon Elastic Compute Cloud (EC2), Amazon Simple Storage Service (S3), AWS Identity and Access Management (IAM), etc.
- Deliver value - lead high-quality delivery of a variety of customized engagements with partners and enterprise customers in the commercial and public sectors.
- Lead great people - attract top IT architecture talent to build high-performing teams of consultants with superior technical depth and customer relationship skills.
- Be a customer advocate - work with AWS engineering teams to convey partner and enterprise customer feedback as input to AWS technology roadmaps.
- Build organization assets - identify patterns and implement solutions that can be leveraged across the customer base. Improve productivity through tooling and process improvements.

About the team

Diverse Experiences
AWS values diverse experiences. Even if you do not meet all of the qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn't followed a traditional path, or includes alternative experiences, don't let it stop you from applying.

Why AWS?
Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating; that's why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses.

Inclusive Team Culture
AWS values curiosity and connection. Our employee-led and company-sponsored affinity groups promote inclusion and empower our people to take pride in what makes us unique. Our inclusion events foster stronger, more collaborative teams. Our continual innovation is fueled by the bold ideas, fresh perspectives, and passionate voices our teams bring to everything we do.

Mentorship & Career Growth
We're continuously raising our performance bar as we strive to become Earth's Best Employer. That's why you'll find endless knowledge-sharing, mentorship, and other career-advancing resources here to help you develop into a better-rounded professional.
Work/Life Balance
We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there's nothing we can't achieve in the cloud.

BASIC QUALIFICATIONS
- Bachelor's degree in Information Science/Information Technology, Computer Science, Engineering, Mathematics, Physics, or a related field.
- 15+ years of IT implementation and/or delivery experience, with 5+ years working in an IT professional services and/or consulting organization, and 5+ years of direct people management leading a team of consultants.
- Deep understanding of cloud computing, adoption strategy, and transition challenges.
- Experience managing a consulting practice or teams responsible for KRAs.
- Ability to travel to client locations to deliver professional services as needed.

PREFERRED QUALIFICATIONS
- Demonstrated ability to think strategically about business, product, and technical challenges.
- Vertical industry sales and delivery experience of contemporary services and solutions.
- Experience with the design of modern, scalable delivery models for technology consulting services.
- Business development experience, including complex agreements with integrators and ISVs.
- International sales and delivery experience with global F500 enterprise customers and partners.
- Direct people management experience leading a team of at least 20, or manager-of-managers experience in a consulting practice.
- Use of AWS services in distributed environments with Microsoft, IBM, Oracle, HP, SAP, etc.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information.
If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Job details IND, TS, Hyderabad IND, KA, Bangalore IND, MH, Maharashtra IND, HR, Gurugram Customer Service
Posted 2 weeks ago
5.0 years
2 - 4 Lacs
Hyderābād
On-site
The opportunity

We are the only professional services organization with a separate business dedicated exclusively to the financial services marketplace. Join the Digital Engineering Team and you will work with multi-disciplinary teams from around the world to deliver a global perspective. Aligned to key industry groups including asset management, banking and capital markets, insurance and private equity, health, government, and power and utilities, we provide integrated advisory, assurance, tax, and transaction services. Through diverse experiences, world-class learning, and individually tailored coaching, you will experience ongoing professional development. That's how we develop outstanding leaders who team to deliver on our promises to all of our stakeholders and, in so doing, play a critical role in building a better working world for our people, for our clients, and for our communities. Sound interesting? Well, this is just the beginning. Because whenever you join, however long you stay, the exceptional EY experience lasts a lifetime.

We're seeking a versatile Full Stack Developer with hands-on experience in Python (including multithreading and popular libraries), GenAI, and AWS cloud services. The ideal candidate should be proficient in backend development using NodeJS, ExpressJS, Python Flask/FastAPI, and RESTful API design, with strong frontend skills in Angular, ReactJS, and TypeScript. EY Digital Engineering is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional capability and product knowledge.
The Digital Engineering (DE) practice works with clients to analyse, formulate, design, mobilize and drive digital transformation initiatives. We advise clients on their most pressing digital challenges and opportunities surrounding business strategy, customer, growth, profit optimization, innovation, technology strategy, and digital transformation. We also have a unique ability to help our clients translate strategy into actionable technical design and transformation planning/mobilization. Through our unique combination of competencies and solutions, EY’s DE team helps our clients sustain competitive advantage and profitability by developing strategies to stay ahead of the rapid pace of change and disruption and supporting the execution of complex transformations.

Your key responsibilities
Application Development: Design and develop cloud-native applications and services using AWS services such as Lambda, API Gateway, ECS, EKS, DynamoDB, Glue, Redshift, and EMR.
Deployment and Automation: Implement CI/CD pipelines using AWS CodePipeline, CodeBuild, and CodeDeploy to automate application deployment and updates.
Architecture Design: Collaborate with architects and other engineers to design scalable and secure application architectures on AWS.
Performance Tuning: Monitor application performance and implement optimizations to enhance reliability, scalability, and efficiency.
Security: Implement security best practices for AWS applications, including identity and access management (IAM), encryption, and secure coding practices.
Container Services Management: Design and deploy containerized applications using AWS services such as Amazon ECS (Elastic Container Service), Amazon EKS (Elastic Kubernetes Service), and AWS Fargate. Configure and manage container orchestration, scaling, and deployment strategies. Optimize container performance and resource utilization by tuning settings and configurations.
Application Observability: Implement and manage application observability tools such as AWS CloudWatch, AWS X-Ray, Prometheus, Grafana, and the ELK Stack (Elasticsearch, Logstash, Kibana). Develop and configure monitoring, logging, and alerting systems to provide insights into application performance and health. Create dashboards and reports to visualize application metrics and logs for proactive monitoring and troubleshooting.
Integration: Integrate AWS services with application components and external systems, ensuring smooth and efficient data flow.
Troubleshooting: Diagnose and resolve issues related to application performance, availability, and reliability.
Documentation: Create and maintain comprehensive documentation for application design, deployment processes, and configuration.

Skills and attributes for success
Required Skills:
AWS Services: Proficiency in AWS services such as Lambda, API Gateway, ECS, EKS, DynamoDB, S3, RDS, Glue, Redshift, and EMR.
Backend: Python (multithreading, Flask, FastAPI), NodeJS, ExpressJS, REST APIs.
Frontend: Angular, ReactJS, TypeScript.
Cloud Engineering: Development with AWS (Lambda, EC2, S3, API Gateway, DynamoDB), Docker, Git, etc.
Proven experience in developing and deploying AI solutions with Python and JavaScript.
Strong background in machine learning, deep learning, and data modelling.
Good to have: CI/CD pipelines, full-stack architecture, unit testing, API integration.
Security: Understanding of AWS security best practices, including IAM, KMS, and encryption.
Observability Tools: Proficiency in using observability tools like AWS CloudWatch, AWS X-Ray, Prometheus, Grafana, and the ELK Stack.
Container Orchestration: Knowledge of container orchestration concepts and tools, including Kubernetes and Docker Swarm.
Monitoring: Experience with monitoring and logging tools such as AWS CloudWatch, CloudTrail, or the ELK Stack.
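The observability responsibilities above can be sketched with standard-library tooling alone. The formatter below emits one JSON object per log line, the shape most log shippers (the CloudWatch agent, Logstash) ingest directly; the logger name and message are illustrative, not from the posting.

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Render each log record as a single JSON line for log shippers."""
    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        })

def build_logger(name: str) -> logging.Logger:
    # Attach a stream handler that writes structured JSON lines.
    logger = logging.getLogger(name)
    handler = logging.StreamHandler()
    handler.setFormatter(JsonFormatter())
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)
    return logger

if __name__ == "__main__":
    build_logger("orders").info("order processed")
```

Structured logs like these make the CloudWatch Logs Insights and Kibana queries mentioned in the posting straightforward, since every field is already key-addressable.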
Collaboration: Strong teamwork and communication skills with the ability to work effectively with cross-functional teams.

Preferred Qualifications:
Certifications: AWS Certified Solutions Architect – Associate or Professional, AWS Certified Developer – Associate, or similar certifications.
Experience: At least 5 years of experience in an application engineering role with a focus on AWS technologies.
Agile Methodologies: Familiarity with Agile development practices and methodologies.
Problem-Solving: Strong analytical skills with the ability to troubleshoot and resolve complex issues.

Education:
Degree: Bachelor’s degree in Computer Science, Engineering, Information Technology, or a related field, or equivalent practical experience.

What we offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.
Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next.
Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way.
Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs.
Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs.
EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 2 weeks ago
5.0 - 7.0 years
4 - 7 Lacs
Raipur
On-site
Internship Raipur, Bengaluru Posted 2 months ago SUBHAG HEALTHTECH PVT LTD

SUBHAG® HealthTech has developed a cutting-edge medical device that enables couples to perform Intra Uterine Insemination (IUI) discreetly in a home environment. This innovative solution aims to address male infertility issues without the need to visit a doctor, expanding infertility treatment through home-based IUI and disrupting the IVF market. With a focus on revolutionizing infertility treatment in India, SUBHAG® HealthTech is dedicated to providing comfortable and effective medical solutions to couples facing fertility challenges.

Job Description
Essential Responsibilities
Create, analyse and maintain explanatory/predictive Data Science models using healthcare data in the infertility segment.
Work with real-world case studies in Data Science and get the chance to implement various modelling techniques.
Gain real-life experience of working with big data in the digital marketing sphere.
Independently execute and lead analytical projects and assignments.
Help solve challenging healthcare-related digital marketing problems globally.
Transform business questions into data requirements; collect and merge the data; analyse the data, link it to the business reality, and present the results.
Develop predictive models and machine learning algorithms to study changes in physician prescribing behaviour, as well as response behaviour to Subhag HealthTech integrated campaigns.
Build analyses to understand user engagement and behaviour across various Subhag HealthTech products.
Build expertise in data preparation, data visualisation and transformation through SAS, R, Tableau and other analytical tools.

Required Qualifications
Must have:
B.Tech/B.E./MSc in Computer Science, Statistics or another relevant specialization from premier institutes.
5-7 years’ experience explicitly in Data Analytics – design, architecture and the software development lifecycle.
Expertise in AWS compute, storage and big data services: EMR, Spark, Redshift, S3, Glue, Step Functions, Lambda (serverless), EKS, SageMaker, DynamoDB, etc.
Experience in one or more ETL tools such as Informatica.
Experience in data warehousing and data and dimensional modelling concepts.
Experience in one or more data visualization tools such as Tableau or Spotfire.
Good understanding of enterprise source systems (ERP, CRM, etc.).
Experience in Python or Core Java, and Java web services development with REST APIs.
Demonstrated ability to translate user needs and requirements into design alternatives.
Hands-on experience in Natural Language Processing and Deep Learning projects.
Experience in the healthcare sector.

Requirements
Knowledge of existing and potential clients, ensuring business growth opportunities in line with the company’s strategic plans.
Well connected with pharma/surgical equipment distributors, hospitals and doctors in the region.
Excellent knowledge of MS Office and using the Internet.
Familiarity with CRM practices along with the ability to build productive professional business relationships.
Highly motivated and target driven with a proven track record in marketing.
Excellent marketing and communication skills.
Prioritizing, time management and organizational skills.
Relationship management skills and openness to feedback.

To apply for this job email your details to hr@subhag.in
Posted 2 weeks ago
0 years
8 - 10 Lacs
Chennai
Remote
ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you’ll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage and passion to drive life-changing impact to ZS.

Our most valuable asset is our people. At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems—the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

Business Technology
ZS’s Technology group focuses on scalable strategies, assets and accelerators that deliver enterprise-wide transformation to our clients via cutting-edge technology. We leverage digital and technology solutions to optimize business processes, enhance decision-making, and drive innovation. Our services include, but are not limited to, digital and technology advisory, product and platform development, and data, analytics and AI implementation.

What you’ll do:
Work with business stakeholders to understand their business needs.
Create data pipelines that extract, transform, and load (ETL) data from various sources into a usable format in a data warehouse.
Clean, filter, and validate data to ensure it meets quality and format standards.
Develop data model objects (tables, views) to transform the data into a unified format for downstream consumption.
Monitor, control, configure, and maintain processes in the cloud data platform.
Optimize data pipelines and data storage for performance and efficiency.
Participate in code reviews and provide meaningful feedback to other team members.
Provide technical support and troubleshoot issues.

What you’ll bring:
Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent work experience.
Experience working in the AWS cloud platform.
Data engineering expertise in developing big data and data warehouse platforms.
Experience working with structured and semi-structured data.
Expertise in developing big data solutions, ETL/ELT pipelines for data ingestion, data transformation, and optimization techniques.
Experience working directly with technical and business teams.
Ability to create technical documentation.
Excellent problem-solving and analytical skills.
Strong communication and collaboration abilities.
AWS (big data services): S3, Glue, Athena, EMR.
Programming: Python, Spark, SQL, MuleSoft, Talend, dbt.
Data warehouse: ETL, Redshift/Snowflake.

Additional Skills:
Experience in data modeling.
AWS certification in data engineering skills.
Experience with ITSM processes/tools such as ServiceNow and Jira.
Understanding of Spark, Hive, Kafka, Kinesis, Spark Streaming, and Airflow.

Perks & Benefits:
ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working.
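The clean/filter/validate step above can be sketched in plain Python (a minimal sketch with illustrative field names; a real pipeline would run this logic inside Glue or Spark rather than over in-memory lists):

```python
# Minimal extract-transform-load sketch over in-memory records.
RAW = [
    {"id": "1", "amount": "10.5", "region": "NA"},
    {"id": "2", "amount": "bad",  "region": "EU"},   # fails validation: amount
    {"id": "3", "amount": "7.25", "region": ""},     # fails validation: region
]

def validate(row: dict) -> bool:
    """Keep rows with a parseable amount and a non-empty region."""
    try:
        float(row["amount"])
    except ValueError:
        return False
    return bool(row["region"])

def transform(row: dict) -> dict:
    # Cast fields into the warehouse-facing types.
    return {"id": int(row["id"]), "amount": float(row["amount"]), "region": row["region"]}

def run_pipeline(rows: list) -> list:
    return [transform(r) for r in rows if validate(r)]

if __name__ == "__main__":
    print(run_pipeline(RAW))  # only the first row survives validation
```

In a production pipeline, rejected rows would typically be routed to a quarantine table for review rather than silently dropped.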
A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel: Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying?
At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.

To Complete Your Application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE.

Find Out More At: www.zs.com
Posted 2 weeks ago
3.0 - 6.0 years
9 - 11 Lacs
Chennai
Remote
ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you’ll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage and passion to drive life-changing impact to ZS.

Our most valuable asset is our people. At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems—the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

Role Overview:
The Tableau Developer will be responsible for creating data visualizations, dashboards, and reporting solutions using Tableau Desktop, Server, and Prep to support business analytics and operational reporting needs.

What you’ll do:
Design and develop interactive dashboards and data visualizations using Tableau.
Develop data models, calculations, and KPIs in line with business requirements.
Connect to diverse data sources (AWS Redshift, RDS, flat files, APIs) and optimize data extracts.
Collaborate with business and data engineering teams to define reporting specifications.
Optimize report performance and implement best practices for visualization and user experience.
Manage Tableau Server content deployment and governance standards.

What you’ll bring:
3-6 years of Tableau development experience.
Strong knowledge of data visualization best practices and dashboard performance tuning.
Proficiency in SQL and familiarity with cloud-based data sources (AWS preferred).
Experience with Tableau Prep and Tableau Server management is a plus.

Additional Skills:
Strong communication skills, both verbal and written, with the ability to structure thoughts logically during discussions and presentations.
Capability to simplify complex concepts into easily understandable frameworks and presentations.
Proficiency in working within a virtual global team environment, contributing to the timely delivery of multiple projects.
Travel to other offices as required to collaborate with clients and internal project teams.

Perks & Benefits:
ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working.

A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel: Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying?
At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law. To Complete Your Application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE. Find Out More At: www.zs.com
Posted 2 weeks ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
Are you passionate about data engineering with cloud technologies, and does creating innovative data management and reporting solutions excite you? Are you looking for an opportunity where you can practice your technical trade while adding to your leadership, consulting, and delivery experience? If you've got technical chops, consulting skills and are a passionate data architect, please read on.

Responsibilities
Accountable for data deliveries for top insurance and financial services clients worldwide.
Implement data warehousing solutions in the AWS cloud and analytics data stores in advanced analytics platforms.
Work closely with BI technology vendors on any need arising from production support, product upgrades, or development.
Document data sources on client projects, perform gap analysis, identify inflows of bad data into the cloud data warehouse, and develop recommendations.
Develop database solutions by designing the proposed system; define the database physical structure and functional capabilities, security, backup, and recovery specifications for internal and external stakeholders.
Work closely with the ETL/ELT team to implement data transformation requirements in the cloud data warehouse.

Qualifications
3+ years of experience in Python programming, specifically in complex data loading using Python.
3+ years of strong hands-on ETL experience is a must, and prior experience with an RDBMS is a must (Postgres preferable).
2+ years of mandatory hands-on experience in AWS cloud technologies, CI/CD (Git/Bitbucket), and managing data in AWS.
2+ years of hands-on experience in Snowflake or AWS Redshift is required.
Data mapping expertise: the ability to understand business requirements and derive data dictionary and data asset requirements for ETL/ELT and DBA groups.
Excellent communication skills, both written and verbal, and attention to detail.
Analytical skills are a must to understand the requirements and work on tasks as an individual contributor.
Ability to collaborate with peers and develop good working relationships.
Knowledge of data warehousing on the AWS platform is required.
AWS certification, especially Cloud Practitioner, is desired.
Hands-on experience with Snowflake will be an asset.
Hands-on experience with Linux shell scripting is preferable.

About Us
For over 50 years, Verisk has been the leading data analytics and technology partner to the global insurance industry by delivering value to our clients through expertise and scale. We empower communities and businesses to make better decisions on risk, faster. At Verisk, you'll have the chance to use your voice and build a rewarding career that's as unique as you are, with work flexibility and the support, coaching, and training you need to succeed.

For the eighth consecutive year, Verisk is proudly recognized as a Great Place to Work® for outstanding workplace culture in the US, fourth consecutive year in the UK, Spain, and India, and second consecutive year in Poland. We value learning, caring and results and make inclusivity and diversity a top priority. In addition to our Great Place to Work® Certification, we’ve been recognized by The Wall Street Journal as one of the Best-Managed Companies and by Forbes as a World’s Best Employer and Best Employer for Women, testaments to the value we place on workplace culture.

We’re 7,000 people strong. We relentlessly and ethically pursue innovation. And we are looking for people like you to help us translate big data into big ideas. Join us and create an exceptional experience for yourself and a better tomorrow for future generations.
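The complex-data-loading qualification in this posting can be sketched with the standard-library `sqlite3` driver (the schema and table name are illustrative, not from the posting; a production pipeline would target Postgres or Redshift with the same batched-insert pattern via its own driver):

```python
import sqlite3

def load_policies(rows: list) -> int:
    """Bulk-load (policy_id, premium) tuples into a staging table; return the row count."""
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE staging_policy (policy_id INTEGER PRIMARY KEY, premium REAL)"
    )
    # executemany batches all inserts into one transaction, the usual bulk pattern.
    conn.executemany("INSERT INTO staging_policy VALUES (?, ?)", rows)
    conn.commit()
    (count,) = conn.execute("SELECT COUNT(*) FROM staging_policy").fetchone()
    conn.close()
    return count

if __name__ == "__main__":
    print(load_policies([(1, 1200.0), (2, 950.5)]))
```

Loading into a staging table first, then merging into the target, keeps partial failures from corrupting production data.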
Verisk Businesses Underwriting Solutions — provides underwriting and rating solutions for auto and property, general liability, and excess and surplus to assess and price risk with speed and precision Claims Solutions — supports end-to-end claims handling with analytic and automation tools that streamline workflow, improve claims management, and support better customer experiences Property Estimating Solutions — offers property estimation software and tools for professionals in estimating all phases of building and repair to make day-to-day workflows the most efficient Extreme Event Solutions — provides risk modeling solutions to help individuals, businesses, and society become more resilient to extreme events. Specialty Business Solutions — provides an integrated suite of software for full end-to-end management of insurance and reinsurance business, helping companies manage their businesses through efficiency, flexibility, and data governance Marketing Solutions — delivers data and insights to improve the reach, timing, relevance, and compliance of every consumer engagement Life Insurance Solutions – offers end-to-end, data insight-driven core capabilities for carriers, distribution, and direct customers across the entire policy lifecycle of life and annuities for both individual and group. Verisk Maplecroft — provides intelligence on sustainability, resilience, and ESG, helping people, business, and societies become stronger Verisk Analytics is an equal opportunity employer. All members of the Verisk Analytics family of companies are equal opportunity employers. We consider all qualified applicants for employment without regard to race, religion, color, national origin, citizenship, sex, gender identity and/or expression, sexual orientation, veteran's status, age or disability. Verisk’s minimum hiring age is 18 except in countries with a higher age limit subject to applicable law. 
https://www.verisk.com/company/careers/ Unsolicited resumes sent to Verisk, including unsolicited resumes sent to a Verisk business mailing address, fax machine or email address, or directly to Verisk employees, will be considered Verisk property. Verisk will NOT pay a fee for any placement resulting from the receipt of an unsolicited resume. Verisk Employee Privacy Notice
Posted 2 weeks ago
3.0 - 7.0 years
5 - 20 Lacs
Noida
On-site
Lead Assistant Manager
EXL/LAM/1411628 | Healthcare Analytics, Noida
Posted On: 03 Jul 2025 | End Date: 17 Aug 2025 | Required Experience: 3 - 7 Years

Basic Section
Number Of Positions: 3
Band: B2
Band Name: Lead Assistant Manager
Cost Code: D010360
Campus/Non Campus: NON CAMPUS
Employment Type: Permanent
Requisition Type: New
Max CTC: 500000.0000 - 2000000.0000
Complexity Level: Not Applicable
Work Type: Hybrid – Working Partly From Home And Partly From Office

Organisational
Group: Analytics
Sub Group: Healthcare
Organization: Healthcare Analytics
LOB: Healthcare D&A
SBU: Healthcare Analytics
Country: India
City: Noida
Center: Noida-SEZ BPO Solutions

Skills: AWS, SQL, PySpark, AWS Glue, Lambda, AWS services, Athena, Git
Minimum Qualification: B.TECH/B.E
Certification: No data available

Job Description
Job Title: Data Engineer - PySpark, Python, SQL, Git, AWS Services (Glue, Lambda, Step Functions, S3, Athena)

We are seeking a talented Data Engineer with expertise in PySpark, Python, SQL, Git, and AWS to join our dynamic team. The ideal candidate will have a strong background in data engineering, data processing, and cloud technologies. You will play a crucial role in designing, developing, and maintaining our data infrastructure to support our analytics.

Responsibilities:
1. Develop and maintain ETL pipelines using PySpark and AWS Glue to process and transform large volumes of data efficiently.
2. Collaborate with analysts to understand data requirements and ensure data availability and quality.
3. Write and optimize SQL queries for data extraction, transformation, and loading.
4. Utilize Git for version control, ensuring proper documentation and tracking of code changes.
5. Design, implement, and manage scalable data lakes on AWS, using S3 or other relevant services for efficient data storage and retrieval.
6. Develop and optimize high-performance, scalable databases using Amazon DynamoDB.
7. Create interactive dashboards and data visualizations with Amazon QuickSight.
8. Automate workflows using AWS services such as EventBridge and Step Functions.
9. Monitor and optimize data processing workflows for performance and scalability.
10. Troubleshoot data-related issues and provide timely resolution.
11. Stay up to date with industry best practices and emerging technologies in data engineering.

Qualifications:
1. Bachelor's degree in Computer Science, Data Science, or a related field; a Master's degree is a plus.
2. Strong proficiency in PySpark and Python for data processing and analysis.
3. Proficiency in SQL for data manipulation and querying.
4. Experience with version control systems, preferably Git.
5. Familiarity with AWS services, including S3, Redshift, Glue, Step Functions, EventBridge, CloudWatch, Lambda, QuickSight, DynamoDB, Athena, CodeCommit, etc.
6. Familiarity with Databricks and its concepts.
7. Excellent problem-solving skills and attention to detail.
8. Strong communication and collaboration skills to work effectively within a team.
9. Ability to manage multiple tasks and prioritize effectively in a fast-paced environment.

Preferred Skills:
1. Knowledge of data warehousing concepts and data modeling.
2. Familiarity with big data technologies like Hadoop and Spark.
3. AWS certifications related to data engineering.

Workflow Type: L&S-DA-Consulting
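The Step Functions workflow automation mentioned above can be sketched as a minimal state machine definition in the Amazon States Language; the state names and Lambda ARNs below are placeholders for illustration, not real resources.

```python
import json

# A two-step ETL orchestration: run a transform task, then a notify task.
# Resource ARNs are placeholders, not real accounts or functions.
STATE_MACHINE = {
    "Comment": "Illustrative ETL orchestration",
    "StartAt": "TransformData",
    "States": {
        "TransformData": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:000000000000:function:transform",
            "Next": "NotifyDone",
        },
        "NotifyDone": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:000000000000:function:notify",
            "End": True,
        },
    },
}

def render_definition() -> str:
    """Serialize the definition as the JSON a deployment tool would upload."""
    return json.dumps(STATE_MACHINE, indent=2)

if __name__ == "__main__":
    print(render_definition())
```

In practice, an EventBridge rule would trigger this state machine on a schedule or on S3 object-created events, covering the automation responsibility in the posting.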
Posted 2 weeks ago