Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
2.0 - 7.0 years
3 - 8 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
Qualifications

Experience Required:
- 3 - 5 years being part of Agile teams
- 3 - 5 years of scripting
- 2+ years of hands-on AWS experience (S3, Lambda)
- 2+ years of experience with PySpark or Python
- 2+ years of experience with cloud technologies such as AWS
- 2+ years of hands-on SQL experience

Experience Desired:
- Experience with GitHub
- Teradata, AWS (Glue, Lambda), Databricks, Snowflake, Angular, REST API, Terraform, Jenkins (CloudBees, Jenkinsfile/Groovy, password vault)

Education and Training Required:
- Knowledge and/or experience with healthcare information domains is a plus
- Computer Science degree is good to have

Primary Skills: JavaScript, Python, PySpark, TDV, R, Ruby, Perl, Lambda, S3, EC2, Databricks, Snowflake, Jenkins, Kafka, APIs, Angular, Selenium, AI & Machine Learning
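As an illustration of the hands-on S3/Lambda skill this posting asks for, here is a minimal sketch of an AWS Lambda handler that processes an S3 "ObjectCreated" event notification. The bucket and key names are invented for the example, and no real AWS resources are involved; the handler is invoked locally with a fabricated event.

```python
# Hypothetical sketch: a minimal AWS Lambda handler for S3 event
# notifications. Bucket/key names are illustrative only.

def lambda_handler(event, context):
    """Collect the (bucket, key) pairs from an S3 event notification."""
    records = event.get("Records", [])
    objects = [
        (r["s3"]["bucket"]["name"], r["s3"]["object"]["key"])
        for r in records
        if "s3" in r
    ]
    return {"statusCode": 200, "objects": objects}

# Local invocation with a fabricated event (no AWS connection needed):
fake_event = {
    "Records": [
        {"s3": {"bucket": {"name": "claims-raw"},
                "object": {"key": "2024/01/file.csv"}}}
    ]
}
result = lambda_handler(fake_event, None)
```

In a real deployment, AWS supplies the `event` dict and the function would typically go on to read the object with boto3; keeping the handler a pure function of the event, as here, makes it easy to unit-test.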
Posted 1 week ago
2.0 - 7.0 years
40 - 45 Lacs
Chandigarh
Work from Office
Responsibilities:
- Design and develop complex data processes in coordination with business stakeholders to solve critical financial and operational problems.
- Design and develop ETL/ELT pipelines against traditional databases and distributed systems, and flexibly produce data back to the business and analytics teams for analysis.
- Work in an agile, fail-fast environment directly with business stakeholders and analysts, while recognising data reconciliation and validation requirements.
- Develop data solutions in coordination with development teams across a variety of products and technologies.
- Build processes that analyse and monitor data to help maintain controls: correctness, completeness, and latency.
- Participate in design reviews and code reviews.
- Work with colleagues across global locations.
- Troubleshoot and resolve production issues.
- Drive performance enhancements.

Required Skills & Qualifications:
- Programming skills: Python / PySpark / Scala
- Database skills: analytical databases such as Snowflake / SQL
- Good to have: Elasticsearch, Kafka, NiFi, Jupyter Notebooks
- Good to have: knowledge of AWS services such as S3 / Glue / Athena / EMR / Lambda
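The "correctness, completeness and latency" controls mentioned in the responsibilities above could be monitored along these lines. This is an assumed illustration, not the employer's implementation: the field names (`loaded_at`), required-field list, and freshness threshold are all invented for the sketch.

```python
# Illustrative data-quality control check: completeness (no missing
# required fields) and latency (records loaded within a freshness window).
# Field names and thresholds are assumptions for the example.
from datetime import datetime, timedelta, timezone

def check_batch(rows, required_fields, max_age_hours=24):
    """Return a dict of control results for a list of record dicts."""
    now = datetime.now(timezone.utc)
    complete = all(
        all(row.get(f) is not None for f in required_fields) for row in rows
    )
    fresh = all(
        now - row["loaded_at"] <= timedelta(hours=max_age_hours)
        for row in rows
    )
    return {"completeness": complete, "latency_ok": fresh, "row_count": len(rows)}

rows = [
    {"id": 1, "amount": 10.0, "loaded_at": datetime.now(timezone.utc)},
    {"id": 2, "amount": None, "loaded_at": datetime.now(timezone.utc)},
]
report = check_batch(rows, required_fields=("id", "amount"))
```

In practice a check like this would run as a scheduled job and alert when a control fails, rather than returning a dict to the caller.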
Posted 2 weeks ago
4.0 - 7.0 years
5 - 15 Lacs
Pune
Hybrid
Are you looking for a stable job with great benefits and pay? Consider becoming part of the Avient team! We know your time is valuable and you have a lot of job ads to review, so let us break down the important details.

Position Summary (Analytics Analyst): Our Avient IT team is seeking an Analyst to be part of a new analytics platform being developed for our global organization. In this role, you will focus on expanding our Snowflake data warehouse, leveraging your robust SQL skills to move analytics data from a raw state into a refined, analytics-ready state where it will be consumed by end users. This position supports a self-service analytics model by delivering well-defined, transformed datasets that provide consistency in reporting, facilitate efficient data mining, and help our business partners generate insights.

Essential Functions:
- Collaborate with business associates and other IT resources to understand analytic needs, including localizations required by our global business.
- Translate those needs into technical solutions that properly move data through the Snowflake landscape.
- Maintain relevant technical documentation to ensure streamlined support and knowledge transfer within areas of responsibility, and support those processes as needed going forward.
- Prepare and execute unit and integration testing; support user acceptance testing.
- Provide hypercare support after each go-live.
- Troubleshoot and resolve analytic issues impacting the business.
- Other duties and projects as assigned.

Qualifications (Education and Experience):
- Bachelor of Science degree in Computer Science, Business, Information Systems, or a related business field required.
- 4+ years of experience in analytics technical roles.
- Snowflake and SQL experience required.
- Experience with Qlik Replicate to ingest data to Snowflake is preferred.
- Experience with Agile methodology and Jira software is preferred.
- Ability to work independently and as part of a global team is required.
- Ability to create and maintain robust documentation of IT processes is required.
- Strong collaboration skills within a team, across IT, and with business users are required.
- Strong troubleshooting and problem-solving skills are required.

Successful candidates will be driven to succeed while fostering a great place to work.

Physical Demands: This vacancy is located in Pune (Kharadi).

About Us: Our purpose at Avient Corporation is to be an innovator of materials solutions that help our customers succeed, while enabling a sustainable world. Innovation goes far beyond materials science; it's powered by the passion, creativity, and diverse expertise of 9,000 professionals worldwide. Whether you're a finance wizard, a tech enthusiast, an operational powerhouse, an HR changemaker, or a trailblazer in materials development, you'll find your place at Avient. Join our global team and help shape the future with sustainable solutions that transform possibilities into realities. Your unique perspective could be the key to our next breakthrough!

Avient Leadership Behaviors: We believe that all of our global employees are leaders and that the six most important behaviors for driving our strategy and culture are the same whether an employee is a leader of self, a leader of others, or a leader of the business. By playing to win, acting customer centric, driving innovation and profitable growth, collaborating seamlessly across Avient, and motivating, inspiring, and developing others and yourself, you will accelerate your ability to achieve Avient's strategic goals, meet our customer needs, and accomplish your career goals.
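The raw-to-refined movement described in the Avient role (typed, cleaned, analytics-ready tables built from raw feeds) can be sketched as follows. The real work would be Snowflake SQL; sqlite3 stands in here so the shape of the transformation is runnable anywhere, and the table and column names are invented for the example.

```python
# Hedged sketch of a raw -> refined transformation, with sqlite3 standing
# in for a Snowflake warehouse. Table/column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (id INTEGER, region TEXT, amount TEXT);
    INSERT INTO raw_orders VALUES
        (1, ' EMEA ', '100.50'),
        (2, 'APAC',   '75.25'),
        (1, ' EMEA ', '100.50');  -- duplicate row, typical of raw feeds

    -- "Refined" layer: typed, trimmed, deduplicated, analytics-ready.
    CREATE TABLE refined_orders AS
    SELECT DISTINCT id, TRIM(region) AS region, CAST(amount AS REAL) AS amount
    FROM raw_orders;
""")
rows = conn.execute(
    "SELECT id, region, amount FROM refined_orders ORDER BY id"
).fetchall()
```

The same pattern in Snowflake would typically be expressed as layered views or tasks over staging tables, with the refined layer exposed to end users for self-service reporting.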
Posted 2 weeks ago
3 - 5 years
5 - 7 Lacs
Hyderabad
Work from Office
Position Summary: Cigna, a leading health services company, is looking for data engineers/developers in our Data & Analytics organization. The Full Stack Engineer is responsible for delivering a business need end-to-end, from understanding the requirements to deploying the software into production. This role requires fluency in some of the critical technologies, proficiency in others, and a hunger to learn on the job and add value to the business. Critical attributes of a Full Stack Engineer, among others, are ownership and accountability. In addition to delivery, the Full Stack Engineer should have an automation-first and continuous-improvement mindset, driving the adoption of CI/CD tools and supporting the improvement of the tool sets and processes.

Behaviors of a Full Stack Engineer: Full Stack Engineers are able to articulate clear business objectives aligned to technical specifications and work in an iterative, agile pattern daily. They take ownership of their work tasks, embrace interacting with all levels of the team, and raise challenges when necessary. We aim to be cutting-edge engineers, not institutionalized developers.

Job Description & Responsibilities:
- Minimize "meetings" to get requirements; have direct business interactions.
- Write referenceable and modular code.
- Design and architect the solution independently.
- Be fluent in particular areas and proficient in many areas.
- Have a passion to learn.
- Take ownership and accountability.
- Understand when to automate and when not to.
- Have a desire to simplify.
- Be entrepreneurial / business-minded.
- Have a quality mindset: not just code quality, but also ensuring ongoing data quality by monitoring data to identify problems before they have business impact.
- Take risks and champion new ideas.

Experience Required:
- 3 years of experience in PySpark/Python
- 3 years of experience with cloud technologies such as AWS
- 3 years of experience in SQL

Experience Desired:
- Experience with GitHub
- Databricks, Snowflake, Jenkins, Kafka, APIs, Angular, Selenium, AI & Machine Learning (good to have)

Education and Training Required:
- Knowledge and/or experience with healthcare information domains is a plus
- Computer Science degree is good to have

Primary Skills: JavaScript, Python, PySpark, TDV, R, Ruby, Perl, Lambda, S3, EC2, Databricks, Snowflake, Jenkins, Kafka, APIs, Angular, Selenium, AI & Machine Learning

Additional Skills:
- Excellent troubleshooting skills
- Strong communication skills
- Fluent in BDD and TDD development methodologies
- Work in an agile CI/CD environment (Jenkins experience a plus)
Posted 2 months ago
2 - 4 years
4 - 6 Lacs
Uttar Pradesh
Work from Office
* Working experience with Snowflake & ADF. * Experience in cloud-based ETL implementations. * Good experience working with ETL tools and very good knowledge of SQL. * Creating ETL mappings to load data from files (and other feeds) and SQL Server databases to a data warehouse on the cloud. * Creating and executing SQL tasks. * Deploying ETL packages to SSIS production servers. * Understanding SSIS packages, analysing queries in SSIS packages, and building models using DBT. * Good communication skills. * Ability to work individually.
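The "load data from files to a data warehouse on the cloud" task above can be sketched in miniature. This is an assumed illustration: a CSV feed is staged into a table, with sqlite3 standing in for the cloud warehouse, and the file layout and table name are hypothetical.

```python
# Illustrative file-to-staging-table load. sqlite3 stands in for a cloud
# warehouse; the CSV layout and stg_inventory table are invented.
import csv
import io
import sqlite3

feed = io.StringIO("sku,qty\nA-1,3\nB-2,5\n")  # stands in for a file feed
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_inventory (sku TEXT, qty INTEGER)")

reader = csv.DictReader(feed)
conn.executemany(
    "INSERT INTO stg_inventory VALUES (?, ?)",
    [(r["sku"], int(r["qty"])) for r in reader],
)
total = conn.execute("SELECT SUM(qty) FROM stg_inventory").fetchone()[0]
```

In an SSIS or ADF pipeline this step would be a data-flow task; downstream, DBT models would then transform the staged data, as the posting describes.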
Posted 2 months ago
7 - 10 years
20 - 27 Lacs
Bengaluru
Hybrid
Job Description: Sr. Manager, Business Intelligence
Location: Bangalore / Rest of India
Reports to: Director, Business Analytics & Insights

A quick snapshot: We're looking for a Senior Manager of Business Intelligence who will lead a team of Power BI developers and drive technical improvements in reporting and automation. This role will oversee prioritization, stakeholder engagement, and process improvements, ensuring that our analytics function delivers scalable and high-impact insights. The ideal candidate excels in Power BI, has experience working with complex data models, and can translate business needs into effective data solutions.

Why it's a big deal: Our business relies on data to drive decision-making, and this role will be instrumental in making that data more accessible, scalable, and actionable. You will be responsible for driving the automation of critical reporting (eliminating manual PowerPoint work), enhancing data model integration with enterprise systems, and improving visualization best practices. With the increasing demand for high-quality insights, your work will have a direct impact on leadership decision-making, operational efficiency, and business performance.

Are you the person we're looking for?
- Related experience. As a Senior Manager of Business Intelligence, you will have 8 years of experience leading a team of BI developers or data analysts. You should have a strong background in Power BI development, data modeling, and DAX, as well as hands-on SQL (preferably in Snowflake) for querying and optimizing data for reporting.
- Stakeholder management. Your responsibilities will include stakeholder engagement, gathering requirements, and project management for enterprise-scale analytics solutions. You will work closely with IT teams on data warehousing and integration, ensuring seamless data flow.
- Implementation. You have experience implementing data governance standards and ensuring data quality, consistency, and security across analytics solutions, along with expertise in creating and maintaining comprehensive documentation for BI processes, data models, and reporting solutions to ensure clarity and compliance. Experience with Power Automate and other automation tools is a plus.
- Team leadership & oversight. Manage and mentor a team of three Power BI developers and one data maintenance analyst, guiding best practices, prioritization, and technical problem-solving.
- Strategic data integration. Collaborate with the enterprise data analytics team to improve integration, scalability, and reusability of existing data models.
- Project & stakeholder management. Partner with the enterprise data analytics team on data warehousing and analytics, and with business stakeholders to define priorities, manage workloads, and ensure alignment with organizational goals.
- Requirements gathering & documentation. Support the business analytics team in translating vague or incomplete business requirements into actionable technical specifications. Document detailed upstream data needs for the IT data warehousing team.
- Automation & efficiency gains. Oversee and contribute to automating routine deliverables, including board reporting and senior leadership reporting, to enhance efficiency and accuracy.

Qualifications: A Bachelor's or Master's degree in engineering or equivalent. A technical degree with an MBA is strongly preferred.

Here's what will give you an edge:
- Excellent communicator. You know what to say, and more importantly how to say it. You're comfortable talking across all levels of an organization, with the ability to effectively convey complex technical concepts to both technical and non-technical audiences.
- Analytical thinker and creative problem solver. You can see issues holistically and follow the flow of the stack to get to the root of the matter -- a key skill in this role. But where you shine is your ability to identify creative solutions to unique business requirements.
- Passion. We know it when we see it. Passion is not about saying how much you love what you do in your most excited voice. Passion is revealed in your true self. It's about what you've accomplished, how you want to grow, the ideas you have, and your philosophies. It's demonstrating through your words and your actions that you truly believe in what you do -- and where you work. That matters to you. And that's cool.
Posted 3 months ago
14 - 20 years
40 - 60 Lacs
Hyderabad, Gurgaon
Work from Office
Responsibilities: This role will deliver Data and AI Product Management (DPM) for various workstreams of EDF programs. DPM responsibilities include:
- Drive governance across all 6 pillars within EDF.
- Coordinate data collection across all tracks to enable monthly EDF and Modern Data Intelligence Program (MDIP) governance calls.
- Report deliverables/costs against the EDF Capital Expense Request for all tracks.
- Ensure key global programs are enabled through EDF's readiness and scalability.
- Execute product management responsibilities for each of the 6 EDF planks.
- Partner with Sustain teams to develop/validate standardized reporting for ongoing run-rate cost optimization of EDF and cost allocation to sectors/programs.
- Govern and report on the decommissioning of Teradata components as a result of MDIP migration activities.
- Drive the transition to sustainable operations across various teams in a coordinated fashion.
- Govern potential migration of EDF due to RFP execution.
- Drive sector data lake migration into EDF.

Coordination across all PEP teams and vendor partners:
- Identify cross-plank dependencies and support execution of tasks in a coordinated fashion.
- Identify requirements outside of the Data/Analytics & AI space and support execution of tasks in a coordinated fashion.
- Partner with the various owners of each plank and their team members to support delivery of milestones.
- Plan and support monthly workshops, including tracking follow-ups and ensuring their timely completion, coordinating across teams where multiple teams are required.
- Ensure appropriate disposition as per assessment requirements and decommission in Teradata.
- Support the FinOps initiative for EDF as a whole, including partnering with various teams on cost allocation and optimization activities.
- Manage stakeholder engagement; align sector business unit participants as required.
- Determine and report on program metrics during monthly status reporting.
- Lead working sessions across teams to collaborate and implement process improvements.
- Provide interface, communication, and coordination across internal and strategic partner teams.
- Use troubleshooting skills and work closely across multiple internal and external teams.

Program-level status reporting and risk management:
- Lead identification of risks, actions, and issues through proactive communication and collaboration with stakeholders and various teams.
- Manage delivery against key technical delivery milestones.
- Ensure technical documentation is readily available for data products produced and that value is communicated to key stakeholders.

Qualifications: Who are we looking for?
- A leader with strong communication and organizational skills, able to deal with ambiguity while juggling multiple priorities and projects at the same time.
- 14+ years of product management, data analytics, data science, or data management and operations in business-facing functions.
- 3+ years of experience leading/building advanced analytics and big data solutions, large-scale data modeling, or building enterprise SaaS.
- 3 years of relevant experience in project management and agile ways of working (Scrum Master).
- Experience working in a data-centric business environment.
- Excellent ability to identify sources of value from data analytics across core business domains (such as marketing, risk, and operations) and to define paths to realizing that value.
- Familiarity with principles and tools for data governance and stewardship.
- History of working in agile environments and successful estimation and delivery of complex products.
- Effective communication skills and an ability to break down complex information into relevant and digestible points, both within Data & Analytics and externally to the business.
- Excellent leadership skills, with a team-player attitude, to drive the end-to-end implementation of use cases under time pressure.
- Demonstrated ability to drive business-oriented and innovative solutions using data science, feature engineering, and machine learning.
- Demonstrated ability to effectively communicate with all levels of the organization and solve complex problems across teams.
- Strong people management skills with the ability to develop teams and cultivate talent, including teams composed of various experience levels.

Mandatory technical skills:
- Experience with data & analytics.
- Experience working with Azure DevOps or similar tools for tracking, developing, and deploying software.
- Experience in cost estimation for Platform-as-a-Service cloud hosting (including both cloud hosting costs and software licensing).
- Knowledge of Azure Databricks or Snowflake.
- Knowledge of Azure or AWS.

Mandatory non-technical skills:
- Experience leading cross-functional teams with diverse skill sets across multiple time zones.
- Support identification and requirements gathering for new processes and services (or changes to existing processes and services) to improve operational support.
- Excellent communication and strong interpersonal skills.
- Experience with Agile and hybrid development and deployment projects; Agile or other project management certification is a plus.
- Solves complex, politically sensitive problems across teams.
- Strong analytical skills to support troubleshooting, resolution, and root-cause determination for complex technology issues.
- Ability to thrive in a fast-paced and demanding work environment.
- Strong documentation skills and the ability to explain complex technical concepts to non-technical personnel.
- Good collaboration and partnership skills to foster key relationships with other technology, application, and business teams.
- Requisite skills to assist with influencing others in order to gain consensus and alignment for key deliverables that span multiple teams and organizations.
- Ability to leverage partnerships with third-party providers to achieve business goals.
- Ability to navigate a complex cloud landscape and still deliver results.
- Vendor partnership management experience.
- Understanding of infrastructure and cloud technologies, principles, and methodologies.
- Experience running programs and initiatives with remote teams in multiple locations.
Posted 3 months ago
10 - 15 years
45 - 60 Lacs
Bengaluru
Work from Office
You'll Get To:
- Provide technical expertise and leadership in technology direction, road-mapping, architecture definition, design, development, and delivery of enterprise-class solutions while adhering to timelines, coding standards, requirements, and quality.
- Architect, design, develop, test, troubleshoot, debug, optimize, scale, perform capacity planning for, deploy, maintain, and improve software applications, driving the delivery of high-quality value and features to BlackLine's customers.
- Work collaboratively across the company to design and communicate best practices in architecture and implementation, and assist with their adoption.
- Deliver robust architectural solutions for complex design problems.
- Implement, refine, and enforce data engineering best practices to ensure that delivered features meet performance, security, and maintainability expectations.
- Research, test, benchmark, and evaluate new tools and technologies, and recommend ways to implement them in the data platform.
- Identify and create solutions that are likely to contribute to the development of new company concepts, keeping in mind the business strategy, the short- and long-term roadmap, and the architectural considerations needed to support them in a highly scalable and easily extensible manner.
- Actively participate in research, development, support, management, and other company initiatives, designing solutions that optimally address current and future business requirements and infrastructure plans.
- Inspire a forward-thinking team of developers, acting as an agent of change and an evangelist for a quality-first culture within the organization.
- Mentor and coach key technical staff and guide them to solutions on complex design issues.
- Act as a conduit for questions and information flow when those outside of Engineering have ideas for new technology applications.
- Speak in terms relevant to the audience, translating technical concepts into non-technical language and vice versa.
- Facilitate consensus building while striving for win/win scenarios, and elicit value-add contributions from all team members in group settings.
- Maintain a strong sense of business value and return on investment in planning, design, and communication.
- Proactively identify issues, bottlenecks, gaps, or other areas of concern or opportunity, and work to either directly effect change or advocate for that change by working with peers and leadership to build consensus and act.
- Perform critical maintenance, deployment, and release support activities, including occasional off-hours support.

What You'll Bring:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 10+ years as a data engineer.
- 10+ years of experience using RDBMS, SQL, and NoSQL; Python, Java, or other programming languages are a plus.
- 10+ years of experience designing, developing, testing, and implementing Extract, Transform, and Load (ELT/ETL) solutions using enterprise and open-source ELT/ETL tools.
- 5+ years of working experience with SQL and familiarity with the Snowflake data warehouse, with strong working knowledge of stored procedures, CTEs, UDFs, and RBAC.
- Knowledge of data integration and data quality best practices.
- Familiarity with data security and privacy regulations.
- Experience working in a startup-type environment; a good team player who can also work independently with minimal supervision.
- Experience with cloud-native architecture and data solutions.
- Strong working knowledge of data modeling, data partitioning, and query optimization.
- Demonstrated knowledge of development processes and agile methodologies.
- Proficiency in managing large volumes of data.
- Strong analytical and interpersonal skills, comfortable presenting complex ideas in simple terms.
- Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams.
- Experience providing technical support and troubleshooting for data-related issues.
- Expertise with at least one cloud environment and building cloud-native data services.
- Prior experience driving data governance, quality, and security initiatives.
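The "stored procedures, CTEs, and UDFs" skills this posting lists can be illustrated in a runnable form. Snowflake hosts these natively in SQL; in this hedged sketch, sqlite3 registers a Python scalar UDF and runs a CTE locally, with table and function names invented for the example.

```python
# Illustrative CTE + UDF usage, with sqlite3 standing in for a warehouse.
# conn.create_function plays the role of CREATE FUNCTION in Snowflake.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 120.0), ("east", 80.0), ("west", 50.0)],
)

# Register a scalar UDF callable from SQL.
conn.create_function("to_upper", 1, lambda s: s.upper())

rows = conn.execute("""
    WITH region_totals AS (   -- CTE shaping data before the final select
        SELECT region, SUM(amount) AS total FROM sales GROUP BY region
    )
    SELECT to_upper(region), total FROM region_totals ORDER BY total DESC
""").fetchall()
```

A warehouse stored procedure would wrap logic like this final query, typically adding parameters, error handling, and RBAC grants controlling who may execute it.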
Posted 1 month ago
8 - 13 years
40 - 45 Lacs
Noida, Gurugram
Work from Office
Responsibilities:
- Design and articulate enterprise-scale data architectures incorporating multiple platforms, including open-source and proprietary data platform solutions (Databricks, Snowflake, and Microsoft Fabric), to address customer requirements in data engineering, data science, and machine learning use cases.
- Conduct technical discovery sessions with clients to understand their data architecture, analytics needs, and business objectives.
- Design and deliver proofs of concept (POCs) and technical demonstrations that showcase modern data platforms solving real-world problems.
- Create comprehensive architectural diagrams and implementation roadmaps for complex data ecosystems spanning cloud and on-premises environments.
- Evaluate and recommend appropriate big data technologies, cloud platforms, and processing frameworks based on specific customer requirements.
- Lead technical responses to RFPs (Requests for Proposal), crafting detailed solution architectures, technical approaches, and implementation methodologies.
- Create and review techno-commercial proposals, including solution scoping, effort estimation, and technology selection justifications.
- Collaborate with sales and delivery teams to develop competitive, technically sound proposals with appropriate pricing models for data solutions.
- Stay current with the latest advancements in data technologies, including cloud services, data processing frameworks, and AI/ML capabilities.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related technical field.
- 8+ years of experience in data architecture, data engineering, or solution architecture roles.
- Proven experience responding to RFPs and developing techno-commercial proposals for data solutions.
- Demonstrated ability to estimate project efforts, resource requirements, and implementation timelines.
- Hands-on experience with multiple data platforms, including Databricks, Snowflake, and Microsoft Fabric.
- Strong understanding of big data technologies, including the Hadoop ecosystem, Apache Spark, and Delta Lake.
- Experience with modern data processing frameworks such as Apache Kafka and Airflow.
- Proficiency in cloud platforms (AWS, Azure, GCP) and their respective data services.
- Knowledge of system monitoring and observability tools.
- Experience implementing automated testing frameworks for data platforms and pipelines.
- Expertise in both relational databases (PostgreSQL, MySQL) and NoSQL databases (MongoDB).
- Understanding of AI/ML technologies and their integration with data platforms.
- Familiarity with data integration patterns, ETL/ELT processes, and data governance practices.
- Experience designing and implementing data lakes, data warehouses, and machine learning pipelines.
- Proficiency in programming languages commonly used in data processing (Python, Scala, SQL).
- Strong problem-solving skills and the ability to think creatively to address customer challenges.
- Relevant certifications such as Databricks, Snowflake, Azure Data Engineer, or AWS Data Analytics are a plus.
- Willingness to travel as required to meet with customers and attend industry events.

If interested, please contact Ramya: 9513487487, 9342164917
Posted 1 month ago
12 - 19 years
45 - 55 Lacs
Noida, Hyderabad, Gurugram
Work from Office
Responsibilities:
- Lead a team of data engineers, providing technical mentorship, performance management, and career development guidance.
- Design and oversee implementation of modern data architecture leveraging cloud platforms (AWS, Azure, GCP) and industry-leading data platforms (Databricks, Snowflake, Microsoft Fabric).
- Establish data engineering best practices, coding standards, and technical documentation processes.
- Develop and execute data platform roadmaps aligned with business objectives and technical innovation.
- Optimize data pipelines for performance, reliability, and cost-effectiveness.
- Collaborate with data science, analytics, and business teams to understand requirements and deliver tailored data solutions.
- Drive adoption of DevOps and DataOps practices, including CI/CD and automated testing.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related technical field.
- 12+ years of experience in data engineering roles, with at least 3 years in a leadership position.
- Expert knowledge of big data technologies (Hadoop ecosystem) and modern data processing frameworks (Apache Spark, Kafka, Airflow).
- Extensive experience with cloud platforms (AWS, Azure, GCP) and cloud-native data services.
- Hands-on experience with industry-leading data platforms such as Databricks, Snowflake, and Microsoft Fabric.
- Strong background in both relational (PostgreSQL, MySQL) and NoSQL (MongoDB) database systems.
- Experience implementing and managing data monitoring solutions (Grafana, Ganglia, etc.).
- Proven track record of implementing automated testing frameworks for data pipelines and applications.
- Knowledge of AI/ML technologies and how they integrate with data platforms.
- Excellent understanding of data modelling, ETL processes, and data warehousing concepts.
- Outstanding leadership, communication, and project management skills.

If interested, please contact Ramya: 9513487487, 9342164917
Posted 1 month ago