5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Cloud Platform Engineer, you will play a crucial role in developing and maintaining Terraform modules and patterns for AWS and Azure. Your responsibilities include creating platform landing zones and application landing zones and deploying application infrastructure. A key aspect of the role is managing the lifecycle of these patterns: releases, bug fixes, feature integrations, and updates to test cases.
You will develop and release Terraform modules, landing zones, and patterns for both AWS and Azure, and provide ongoing support for them, including bug fixing and maintenance. You will also integrate new features into existing patterns to enhance their functionality, ensure that updated and new patterns meet current requirements, and update and maintain test cases to guarantee reliability and performance.
To qualify for this role, you should have at least 5 years of experience in AWS and Azure cloud migration. Proficiency in cloud compute (such as EC2, EKS, Azure VM, AKS) and storage (such as S3, EBS, EFS, Azure Blob, Azure Managed Disks, Azure Files) is required, along with strong knowledge of AWS and Azure cloud services and expertise in Terraform. An AWS or Azure certification would be advantageous.
Key Qualifications:
- 5+ years of AWS/Azure cloud migration experience
- Proficiency in cloud compute and storage
- Strong knowledge of AWS and Azure cloud services
- Expertise in Terraform
- AWS/Azure certification preferred
Mandatory Skills: Cloud AWS DevOps (minimum 5 years of migration experience)
Relevant Experience: 5-8 years
This is a Full-time, Permanent, or Contractual/Temporary job with a contract length of 12 months.
Benefits:
- Health insurance
- Provident Fund
Schedule: Day shift, Monday to Friday, Morning shift
Additional Information:
- Performance bonus
- Yearly bonus
Posted 11 hours ago
5.0 - 9.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Educational Requirements: MBA, MSc, MTech, Bachelor of Science (Tech), Bachelor of Engineering, Bachelor of Technology (Integrated)
Service Line: Enterprise Package Application Services
Responsibilities:
- Over 7+ years of IT experience, including 5+ years of extensive experience as a React JS developer and 5 years of experience as a UI/UX developer / API developer
- Extensive experience in developing web pages and single-page apps using HTML/HTML5, DHTML, CSS3, JavaScript, React JS 16+, Redux, Node.js, Express.js, and Jest
- Experienced in MERN stack development: MongoDB, Express.js, Node, and React JS
- Experience in all phases of the SDLC, such as requirement analysis, implementation, and maintenance, with extensive experience in Agile and Scrum
- Extensive knowledge of developing single-page applications (SPAs)
- Working knowledge of web protocols and standards (REST, SSO, etc.)
- Good expertise in development and debugging tools such as VS Code, git, npm, and Chrome developer tools; familiar with creating a custom reusable React component library
- Involved in writing application-level code to interact with APIs and RESTful web services using AJAX and JSON
- Knowledge of cloud technologies including Amazon Web Services (AWS), Microsoft Azure Blob, and Pivotal Cloud Foundry (PCF)
- Expertise in RESTful services for application-to-application integration
- Experience with front-end development and back-end system integration
- Proficient in using the Jest framework for unit testing
- Good experience with bug-tracking tools like JIRA and HP Quality Center
- Ability to work effectively both as a team member and individually
- Excellent communication and interpersonal skills; well organized and goal oriented
Additional Responsibilities: What's in it for you? We are not just a technology company full of people, we're a people company full of technology. It is people like you who make us what we are today. Welcome to our world: our people, our culture, our voices, and our passions.
What's better than building the next big thing? It's doing so while never letting go of the little things that matter. None of the amazing things we do at Infosys would be possible without an equally amazing culture, the environment in which to do them, one where ideas can flourish and where you are empowered to move forward as far as your ideas will take you. This is something we achieve by cultivating a culture of inclusiveness and openness and a mindset of exploration and applied innovation. A career at Infosys means experiencing and contributing to this environment every day. It means being part of a dynamic culture where we are united by a common purpose: to navigate further, together.
EOE/Minority/Female/Veteran/Disabled/Sexual Orientation/Gender Identity/National Origin
At Infosys, we recognize that everyone has individual requirements. If you are a person with a disability, illness, or injury and require adjustments to the recruitment and selection process, please contact our Recruitment team at Infosys_ta@infosys.com, or include your preferred method of communication in the email and someone will be in touch.
Please note that, in order to protect the interests of all parties involved in the recruitment process, Infosys does not accept unsolicited resumes from third-party vendors. In the absence of a signed agreement, any submission will be deemed non-binding, and Infosys explicitly reserves the right to pursue and hire the submitted profile. All recruitment activity must be coordinated through the Talent Acquisition department.
Technical and Professional Requirements:
- Participate in the estimation of work products in order to provide the right information to the TL/PM for overall project estimation
- Understand the requirements, both functional and non-functional, by going through the specifications and with inputs from business analysts; participate in creating high-level estimates and translate them into systems requirements in order to create a systems requirements document
- Participate effectively in the design, development, and testing phases of the project
- Develop and review artifacts (code, documentation, unit test scripts); conduct reviews for self and peers; conduct unit tests and document unit test results for complex programs in order to build the application and make it ready for validation/delivery
Preferred Skills: Technology - Reactive Programming - React JS
Posted 1 week ago
3.0 - 6.0 years
14 - 18 Lacs
Kochi
Work from Office
As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
- Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Total experience: 6-7 years (relevant: 4-5 years)
- Mandatory skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob
- Ability to use programming languages like Java, Python, Scala, etc. to build pipelines that extract and transform data from a repository to a data consumer
- Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java
Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
- Ability to communicate results to technical and non-technical audiences
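The extract-transform-load work this posting describes can be sketched in plain Python. This is an illustrative toy only (function and field names are hypothetical); the role itself would do the same steps at scale with PySpark/Databricks:

```python
# Minimal ETL sketch: extract raw records, cleanse/transform them, and load
# the result into a target store. A Python list stands in for a real table.

def extract(rows):
    """Extract: yield raw records from an upstream source."""
    yield from rows

def transform(record):
    """Transform: normalise fields; drop records missing a key (cleansing)."""
    if not record.get("id"):
        return None  # malformed row, skipped downstream
    return {
        "id": int(record["id"]),
        "name": record.get("name", "").strip().title(),
    }

def load(records, target):
    """Load: append cleansed records to the target."""
    for r in records:
        target.append(r)

def run_pipeline(source, target):
    cleansed = (transform(r) for r in extract(source))
    load((r for r in cleansed if r is not None), target)

raw = [
    {"id": "1", "name": "  ada lovelace "},
    {"name": "no id"},            # will be dropped by the cleanse step
    {"id": "2", "name": "grace hopper"},
]
table = []
run_pipeline(raw, table)
print(table)
```

The same extract/transform/load split maps directly onto Databricks jobs, with the generators replaced by DataFrame transformations.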
Posted 1 week ago
3.0 - 6.0 years
14 - 18 Lacs
Bengaluru
Work from Office
As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
- Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Total experience: 3-6 years (relevant: 4-5 years)
- Mandatory skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob
- Ability to use programming languages like Java, Python, Scala, etc. to build pipelines that extract and transform data from a repository to a data consumer
- Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed
Ability to use leading edge tools such as Linux, SQL, Python, Spark, Hadoop and Java Preferred technical and professional experience You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions Ability to communicate results to technical and non-technical audiences
Posted 1 week ago
4.0 - 9.0 years
15 - 30 Lacs
Bengaluru
Work from Office
Data Engineer - Python
We are looking for a Data Engineer with experience in building data pipelines and implementing AI/ML solutions in Azure. This role involves integrating structured and unstructured data sources for efficient retrieval and processing in support of OpenAI-based RAG pipelines.
Responsibilities:
- Design and implement data pipelines using Azure Data Factory or equivalent
- Set up and manage SQL databases and integrate with Azure AI Search
- Prepare and store embeddings for RAG using Azure Vector Search
- Ensure data quality, versioning, and security with Azure Blob, Key Vault, and monitoring
- Collaborate with prompt engineers and backend teams to optimize data flow
- Collaborate with various stakeholders to determine software requirements
- Design and develop logical flows for each business requirement
- Prepare technical documentation for each feature and guide/coach junior developers during the implementation phase
- Work closely with the other members of the backend and frontend teams to integrate different components into the applications
- Research and implement new technologies for the product
- Troubleshoot and resolve issues with coding or design
- Test the final product to ensure it is completely functional and meets requirements
Requirements:
- SQL, Azure Data Factory, Azure Blob, Azure Key Vault
- Experience with vector stores and embeddings
- Familiarity with Azure OpenAI and AI Search
- Data modeling and performance tuning
- REST API integration and scripting
- Experience building software with Python
- Experience building applications with microservices and serverless architecture
- 4+ years of experience in software development roles
- Good communication skills
Technical Skills:
- Language/Framework: Python, Azure Data Factory, OneLake, Azure Blob, Azure Key Vault
- Database: Postgres, SQL Server
- OS: Unix/Linux/Windows/Serverless
- Others (good to have): machine learning model building; NLP with NLTK, spaCy, NumPy, etc.
Location: Bangalore
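The retrieval half of the RAG pipeline mentioned above reduces to embedding documents and ranking them by similarity to a query. A hedged in-memory sketch follows: the toy bag-of-words `embed` stands in for a real embedding model (e.g. one served via Azure OpenAI), and a Python list stands in for Azure Vector Search; all names are illustrative.

```python
# RAG retrieval sketch: embed documents, then rank by cosine similarity
# to a query embedding. Everything here is a toy stand-in for the managed
# Azure services the role actually uses.
import math
from collections import Counter

def embed(text, vocab):
    """Toy bag-of-words embedding over a fixed vocabulary (stands in for a model)."""
    counts = Counter(text.lower().split())
    return [counts[w] for w in vocab]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def top_k(query, docs, vocab, k=1):
    """Return the k documents most similar to the query."""
    q = embed(query, vocab)
    return sorted(docs, key=lambda d: cosine(q, embed(d, vocab)), reverse=True)[:k]

vocab = ["data", "pipeline", "invoice", "payment"]
docs = ["data pipeline design notes", "invoice and payment terms"]
print(top_k("how do I fix the data pipeline", docs, vocab))
```

In production the ranking step is pushed down into the vector store, but the contract is the same: query embedding in, nearest documents out.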
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
kolkata, west bengal
On-site
Wipro Limited is a leading technology services and consulting company dedicated to creating innovative solutions for clients' most complex digital transformation needs. With a vast portfolio of capabilities in consulting, design, engineering, and operations, Wipro aims to help clients achieve their boldest ambitions and establish future-ready, sustainable businesses. With over 230,000 employees and business partners spanning 65 countries, Wipro is committed to supporting customers, colleagues, and communities in navigating an ever-changing world.
Role Purpose: The primary objective of this role is to facilitate process delivery by ensuring the daily performance of Production Specialists, addressing technical escalations, and enhancing the technical capabilities of the Production Specialists.
Key Requirements:
- 7-10 years of software development experience
- Proficiency in .NET Core 5.0 or above (Web, API)
- Proficiency in Azure services (serverless computing, Azure Functions, Azure Durable Functions, Azure Storage, Azure Service Bus, Azure Blob, Azure Table Storage, Azure APIM)
- Strong object-oriented programming (OOP) design skills and proficiency with software design patterns
- Strong knowledge of SQL Server
- Experience in microservices-architecture-based development
- Good communication skills
Responsibilities:
- Handle technical escalations by diagnosing and troubleshooting client queries effectively
- Manage and resolve technical roadblocks/escalations within SLA and quality requirements
- Escalate unresolved issues to TA & SES when necessary
- Provide product support and resolutions to clients through guided step-by-step solutions
- Troubleshoot client queries professionally and courteously
- Offer alternative solutions to retain customers' business
- Communicate effectively with different listeners and situations
- Conduct triage-based trainings to bridge skill gaps and enhance the technical knowledge of Production Specialists
- Stay current with product features through relevant trainings
- Identify common problems, recommend resolutions, and document findings
- Continuously update job knowledge through self-learning opportunities and personal networks
Performance Parameters:
1. Process: number of cases resolved per day, compliance with process and quality standards, meeting SLAs, Pulse score, customer feedback
2. Team Management: productivity, efficiency, absenteeism
3. Capability Development: triages completed, Technical Test performance
Join Wipro in reinventing your world and be part of a company that encourages constant evolution and personal reinvention. Applications from individuals with disabilities are warmly welcomed.
Posted 1 week ago
12.0 - 16.0 years
0 Lacs
hyderabad, telangana
On-site
You are an experienced Snowflake Architect with over 12 years of experience in data warehousing, cloud architecture, and Snowflake implementations. Your expertise lies in designing, optimizing, and managing large-scale Snowflake data platforms to ensure scalability, performance, and security. You are expected to possess deep technical knowledge of Snowflake, cloud ecosystems, and data engineering best practices.
Your key responsibilities will include leading the design and implementation of Snowflake data warehouses, data lakes, and data marts. You will define best practices for Snowflake schema design, clustering, partitioning, and optimization. Additionally, you will architect multi-cloud Snowflake deployments with seamless integration and design data sharing, replication, and failover strategies for high availability.
You will be responsible for optimizing query performance using Snowflake features, implementing automated scaling strategies for dynamic workloads, and troubleshooting performance bottlenecks in large-scale Snowflake environments. Furthermore, you will architect ETL/ELT pipelines using Snowflake, Coalesce, and other tools; integrate Snowflake with BI tools, ML platforms, and APIs; and implement CDC, streaming, and batch processing solutions.
In terms of security, governance, and compliance, you will define RBAC, data masking, row-level security, and encryption policies in Snowflake. You will ensure compliance with GDPR, CCPA, HIPAA, and SOC 2 regulations and establish data lineage, cataloging, and auditing using Snowflake's governance features. As a leader, you will mentor data engineers, analysts, and developers on Snowflake best practices, collaborate with C-level executives to align the Snowflake strategy with business goals, and evaluate emerging trends for innovation.
Your required skills and qualifications include over 12 years of experience in data warehousing, cloud architecture, and database technologies; 8+ years of hands-on Snowflake architecture and administration experience; and expertise in SQL and Python for data processing. Deep knowledge of Snowflake features, experience with cloud platforms, and a strong understanding of data modeling are also essential. Snowflake Advanced Architect certification is a must. Preferred skills include knowledge of DataOps, MLOps, and CI/CD pipelines, as well as familiarity with DBT, Airflow, SSIS, and IICS.
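One of the governance controls this role covers is dynamic data masking. As a hedged illustration (the policy name, return type, and role are hypothetical), the DDL for a role-gated masking policy can be generated like this; in practice it would be executed via the Snowflake connector or a migration tool:

```python
# Sketch: build CREATE MASKING POLICY DDL that reveals column values only
# to an allowed role and masks them for everyone else. Names are illustrative.

def masking_policy_ddl(policy, returns="string", allowed_role="ANALYST_FULL"):
    """Build Snowflake DDL for a role-gated dynamic data masking policy."""
    return (
        f"CREATE OR REPLACE MASKING POLICY {policy} AS (val {returns}) "
        f"RETURNS {returns} ->\n"
        f"  CASE WHEN CURRENT_ROLE() = '{allowed_role}' THEN val "
        f"ELSE '***MASKED***' END;"
    )

ddl = masking_policy_ddl("pii.email_mask")
print(ddl)
```

The generated policy would then be attached to a column with `ALTER TABLE ... MODIFY COLUMN ... SET MASKING POLICY`, which is how Snowflake applies masking at query time rather than at load time.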
Posted 1 week ago
6.0 - 7.0 years
14 - 18 Lacs
Bengaluru
Work from Office
As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
- Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Total experience: 6-7 years (relevant: 4-5 years)
- Mandatory skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob
- Ability to use programming languages like Java, Python, Scala, etc. to build pipelines that extract and transform data from a repository to a data consumer
- Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java
Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
- Ability to communicate results to technical and non-technical audiences
Posted 2 weeks ago
7.0 - 12.0 years
8 - 13 Lacs
Bengaluru
Work from Office
Date: 25 Jun 2025. Location: Bangalore, KA, IN. Company: Alstom.
At Alstom, we understand transport networks and what moves people. From high-speed trains, metros, monorails, and trams to turnkey systems, services, infrastructure, signalling, and digital mobility, we offer our diverse customers the broadest portfolio in the industry. Every day, 80,000 colleagues lead the way to greener and smarter mobility worldwide, connecting cities as we reduce carbon and replace cars.
Your future role: Take on a new challenge and apply your data engineering expertise in a cutting-edge field. You'll work alongside collaborative and innovative teammates. You'll play a key role in enabling data-driven decision-making across the organization by ensuring data availability, quality, and accessibility. Day-to-day, you'll work closely with teams across the business (e.g., Data Scientists, Analysts, and ML Engineers), mentor junior engineers, and contribute to the architecture and design of our data platforms and solutions. You'll specifically take care of designing and developing scalable data pipelines, as well as managing and optimizing object storage systems.
We'll look to you for:
- Designing, developing, and maintaining scalable and efficient data pipelines using tools like Apache NiFi and Apache Airflow
- Creating robust Python scripts for data ingestion, transformation, and validation
- Managing and optimizing object storage systems such as Amazon S3, Azure Blob, or Google Cloud Storage
- Collaborating with Data Scientists and Analysts to understand data requirements and deliver production-ready datasets
- Implementing data quality checks, monitoring, and alerting mechanisms
- Ensuring data security, governance, and compliance with industry standards
- Mentoring junior engineers and promoting best practices in data engineering
All about you: We value passion and attitude over experience. That's why we don't expect you to have every single skill. Instead, we've listed some that we think will help you succeed and grow in this role:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- 7+ years of experience in data engineering or a similar role
- Strong proficiency in Python and data processing libraries (e.g., Pandas, PySpark)
- Hands-on experience with Apache NiFi for data flow automation
- Deep understanding of object storage systems and cloud data architectures
- Proficiency in SQL and experience with both relational and NoSQL databases
- Familiarity with cloud platforms (AWS, Azure, or GCP)
- Exposure to the data science ecosystem, including tools like Jupyter, scikit-learn, TensorFlow, or MLflow
- Experience working in cross-functional teams with Data Scientists and ML Engineers
- Cloud certifications or relevant technical certifications are a plus
Things you'll enjoy: Join us on a lifelong transformative journey: the rail industry is here to stay, so you can grow and develop new skills and experiences throughout your career. You'll also:
- Enjoy stability, challenges, and a long-term career free from boring daily routines
- Work with advanced data and cloud technologies to drive innovation
- Collaborate with cross-functional teams and helpful colleagues
- Contribute to innovative projects that have a global impact
- Utilise our flexible and hybrid working environment
- Steer your career in whatever direction you choose, across functions and countries
- Benefit from our investment in your development through award-winning learning programs
- Progress towards leadership roles or specialized technical paths
- Benefit from a fair and dynamic reward package that recognises your performance and potential, plus comprehensive and competitive social coverage (life, medical, pension)
You don't need to be a train enthusiast to thrive with us. We guarantee that when you step onto one of our trains with your friends or family, you'll be proud. If you're up for the challenge, we'd love to hear from you!
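The "data quality checks, monitoring, and alerting" duty in this posting can be sketched as named validation rules run over a batch, with failures collected for alerting. This is an illustrative sketch only (rule names, columns, and thresholds are hypothetical); in production such checks might live in an Airflow task or a NiFi processor:

```python
# Data-quality sketch: run named checks over a batch of records and return
# the failing row indices per rule, which a monitor would turn into alerts.

def check_not_null(rows, column):
    """Indices of rows where the column is missing or null."""
    return [i for i, r in enumerate(rows) if r.get(column) is None]

def check_range(rows, column, lo, hi):
    """Indices of rows where a present value falls outside [lo, hi]."""
    return [i for i, r in enumerate(rows)
            if r.get(column) is not None and not (lo <= r[column] <= hi)]

def run_quality_checks(rows):
    rules = {
        "speed_not_null": check_not_null(rows, "speed_kmh"),
        "speed_in_range": check_range(rows, "speed_kmh", 0, 350),
    }
    # Keep only rules that actually failed; each entry would trigger an alert.
    return {name: bad for name, bad in rules.items() if bad}

batch = [{"speed_kmh": 120}, {"speed_kmh": None}, {"speed_kmh": 900}]
print(run_quality_checks(batch))
```

Keeping each rule as a separate function makes the check suite easy to extend and lets the alerting layer report exactly which rule failed on which rows.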
Important to note: As a global business, we're an equal-opportunity employer that celebrates diversity across the 63 countries we operate in. We're committed to creating an inclusive workplace for everyone.
Posted 3 weeks ago
5.0 - 8.0 years
10 - 14 Lacs
Chennai
Work from Office
Mandatory Skills: Terraform modules, DevOps, AWS and Azure
Years of experience needed: minimum of 8 years
Work Location: Chennai
Cloud Platform Engineer - Chennai - JC-75156/75158 - Band B3 - No. of positions: 3
Position Overview: The Cloud Platform Engineer will be responsible for developing and maintaining Terraform modules and patterns for AWS and Azure. These modules and patterns will be used for platform landing zones, application landing zones, and application infrastructure deployments. The role involves managing the lifecycle of these patterns, including releases, bug fixes, feature integrations, and updates to test cases.
Key Responsibilities:
- Develop and release Terraform modules, landing zones, and patterns for AWS and Azure
- Provide lifecycle support for patterns, including bug fixing and maintenance
- Integrate new features into existing patterns to enhance functionality
- Release updated and new patterns to ensure they meet current requirements
- Update and maintain test cases for patterns to ensure reliability and performance
Qualifications:
- 5+ years of AWS/Azure cloud migration experience
- Proficiency in cloud compute (EC2, EKS, Azure VM, AKS) and storage (S3, EBS, EFS, Azure Blob, Azure Managed Disks, Azure Files)
- Strong knowledge of AWS and Azure cloud services
- Expert in Terraform
- AWS/Azure certification preferred
- Provide customer support/service on the DevOps tools; provide timely support for internal and external customer escalations on multiple platforms
- Troubleshoot the various problems that arise in the implementation of DevOps tools across the project/module
- Perform root cause analysis of major incidents/critical issues which may hamper project timelines, quality, or cost
- Develop alternate plans/solutions to be implemented as per root cause analysis of critical problems
- Follow the escalation matrix/process as soon as a resolution gets complicated or isn't resolved
- Provide knowledge transfer, share best practices with the team, and motivate
Team Management:
- Resourcing: forecast talent requirements as per current and future business needs; hire adequate and right resources for the team; train direct reportees to make right recruitment and selection decisions
- Talent Management: ensure 100% compliance with Wipro's standards of adequate onboarding and training for team members to enhance capability and effectiveness; build an internal talent pool of HiPos and ensure their career progression within the organization; promote diversity in leadership positions
- Performance Management: set goals for direct reportees, conduct timely performance reviews and appraisals, and give constructive feedback to direct reports; in case of performance issues, take necessary action, with zero tolerance for will-based performance issues; ensure that organizational programs like Performance Nxt are well understood and that the team is taking the opportunities presented by such programs, for themselves and the levels below them
- Employee Satisfaction and Engagement: lead and drive engagement initiatives for the team; track team satisfaction scores and identify initiatives to build engagement within the team; proactively challenge the team with larger and enriching projects/initiatives for the organization or team; exercise employee recognition and appreciation
Deliver (No. / Performance Parameter / Measure):
1. Continuous Integration, Deployment & Monitoring: 100% error-free onboarding and implementation
2. CSAT: manage service tools, troubleshoot queries, customer experience
3. Capability Building & Team Management: % trained on new-age skills, team attrition %, employee satisfaction score
Mandatory Skills: Cloud AWS DevOps. Experience: 5-8 years.
Posted 3 weeks ago
5.0 - 8.0 years
10 - 14 Lacs
Chennai
Work from Office
Mandatory Skills: Terraform modules, DevOps, AWS and Azure
Years of experience needed: minimum of 8 years
Work Location: Chennai
Cloud Platform Engineer - Chennai - Band B3 - No. of positions: 3
Position Overview: The Cloud Platform Engineer will be responsible for developing and maintaining Terraform modules and patterns for AWS and Azure. These modules and patterns will be used for platform landing zones, application landing zones, and application infrastructure deployments. The role involves managing the lifecycle of these patterns, including releases, bug fixes, feature integrations, and updates to test cases.
Key Responsibilities:
- Develop and release Terraform modules, landing zones, and patterns for AWS and Azure
- Provide lifecycle support for patterns, including bug fixing and maintenance
- Integrate new features into existing patterns to enhance functionality
- Release updated and new patterns to ensure they meet current requirements
- Update and maintain test cases for patterns to ensure reliability and performance
Qualifications:
- 5+ years of AWS/Azure cloud migration experience
- Proficiency in cloud compute (EC2, EKS, Azure VM, AKS) and storage (S3, EBS, EFS, Azure Blob, Azure Managed Disks, Azure Files)
- Strong knowledge of AWS and Azure cloud services
- Expert in Terraform
- AWS/Azure certification preferred
Team Management:
- Resourcing: forecast talent requirements as per current and future business needs; hire adequate and right resources for the team; train direct reportees to make right recruitment and selection decisions
- Talent Management: ensure 100% compliance with Wipro's standards of adequate onboarding and training for team members to enhance capability and effectiveness; build an internal talent pool of HiPos and ensure their career progression within the organization; promote diversity in leadership positions
- Performance Management: set goals for direct reportees, conduct timely performance reviews and appraisals, and give constructive feedback to direct reports; in case of performance issues, take necessary action, with zero tolerance for will-based performance issues; ensure that organizational programs like Performance Nxt are well understood and that the team is taking the opportunities presented by such programs, for themselves and the levels below them
- Employee Satisfaction and Engagement: lead and drive engagement initiatives for the team; track team satisfaction scores and identify initiatives to build engagement within the team; proactively challenge the team with larger and enriching projects/initiatives for the organization or team; exercise employee recognition and appreciation
Deliver (No. / Performance Parameter / Measure):
1. Continuous Integration, Deployment & Monitoring: 100% error-free onboarding and implementation
2. CSAT: manage service tools, troubleshoot queries, customer experience
3. Capability Building & Team Management: % trained on new-age skills, team attrition %, employee satisfaction score
Mandatory Skills: Cloud AWS DevOps. Experience: 5-8 years.
Posted 3 weeks ago
8.0 - 13.0 years
20 - 35 Lacs
Chennai
Work from Office
Warm Greetings from SP Staffing Services Pvt Ltd! Experience: 8-15 years. Work Location: Chennai.
Job Description:
Required Technical Skill Set: Azure native technologies, Synapse and Databricks, Python
Desired Experience Range: 8+ years
Location of Requirement: Chennai
Required Skills: Previous experience as a data engineer or in a similar role. Must have experience with MS Azure services such as Data Lake Storage, Data Factory, Databricks, Azure SQL Database, Azure Synapse Analytics, and Azure Functions. Technical expertise with data models, data mining, analytics, and segmentation techniques. Knowledge of programming languages and environments such as Python, Java, Scala, R, .NET/C#. Hands-on experience with SQL database design. Great numerical and analytical skills. Degree in Computer Science, IT, or a similar field; a master's is a plus. Experience integrating Azure PaaS services.
Interested candidates, kindly share your updated resume to ramya.r@spstaffing.in or contact 8667784354 (WhatsApp: 9597467601) to proceed further.
Posted 3 weeks ago
6.0 - 11.0 years
3 - 6 Lacs
Noida
Work from Office
We are looking for a skilled Snowflake Ingress/Egress Specialist with 6 to 12 years of experience to manage and optimize data flow into and out of our Snowflake data platform. This role involves implementing secure, scalable, and high-performance data pipelines, ensuring seamless integration with upstream and downstream systems, and maintaining compliance with data governance policies.
Roles and Responsibility: Design, implement, and monitor data ingress and egress pipelines in and out of Snowflake. Develop and maintain ETL/ELT processes using tools like Snowpipe, Streams, Tasks, and external stages (S3, Azure Blob, GCS). Optimize data load and unload processes for performance, cost, and reliability. Coordinate with data engineering and business teams to support data movement for analytics, reporting, and external integrations. Ensure data security and compliance by managing encryption, masking, and access controls during data transfers. Monitor data movement activities using Snowflake Resource Monitors and Query History.
Qualifications: Bachelor's degree in Computer Science, Information Systems, or a related field. 6-12 years of experience in data engineering, cloud architecture, or Snowflake administration. Hands-on experience with Snowflake features such as Snowpipe, Streams, Tasks, External Tables, and Secure Data Sharing. Proficiency in SQL, Python, and data movement tools (e.g., AWS CLI, Azure Data Factory, Google Cloud Storage Transfer). Experience with data pipeline orchestration tools such as Apache Airflow, dbt, or Informatica. Strong understanding of cloud storage services (S3, Azure Blob, GCS) and working with external stages. Familiarity with network security, encryption, and data compliance best practices. Snowflake certification (SnowPro Core or Advanced) is preferred. Experience with real-time streaming data (Kafka, Kinesis) is desirable. Knowledge of DevOps tools (Terraform, CI/CD pipelines) is a plus. Strong communication and documentation skills are essential.
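As a hedged illustration of the Snowpipe-based ingestion named above, a helper can assemble the `CREATE PIPE ... AS COPY INTO` DDL as a string. The pipe, table, and stage names are invented for the example; actually executing the DDL would require a Snowflake connection, which is out of scope here.

```python
def create_pipe_ddl(pipe: str, table: str, stage: str,
                    file_format: str = "(TYPE = 'CSV')") -> str:
    """Assemble a CREATE PIPE statement that auto-ingests staged files
    into a target table via COPY INTO."""
    return (
        f"CREATE PIPE IF NOT EXISTS {pipe} AUTO_INGEST = TRUE AS "
        f"COPY INTO {table} FROM @{stage} FILE_FORMAT = {file_format};"
    )

# Hypothetical pipe, table, and external stage names:
ddl = create_pipe_ddl("raw.orders_pipe", "raw.orders", "raw.s3_orders_stage")
print(ddl)
```

With `AUTO_INGEST = TRUE`, the pipe loads files as the cloud storage event notifications arrive, which is the pattern the role's "ingress pipelines" refer to.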
Posted 1 month ago
4.0 - 9.0 years
6 - 11 Lacs
Vadodara
Work from Office
Cloud Consultant | Document Management Software | Docsvault
Summary: We are looking for an Azure Consultant with 4+ years of relevant experience in Microsoft Azure to design and implement cloud architecture for our new cloud application.
Responsibilities and Duties: Design, consult, and advise on state-of-the-art technical solutions on Azure that address our Angular/.NET Core/MySQL application's requirements for scalability, reliability, security, and performance. Take complete ownership of the design, deployment, implementation, security, and maintenance plans of our application on Microsoft Azure. Evaluate stakeholder requirements; develop, communicate, and present solutions to the dev team and management. Support our teams in driving documentation, written requirements, and strategic direction to transfer knowledge and responsibility for our cloud infrastructure. Recommend client value-creation initiatives and implement industry best practices. Provide valuable contributions and adapt to post-implementation support (long term). Demonstrate a hands-on ability to deliver appropriate technical solutions within project and program time frames.
Desired Candidate Profile: Significant experience in solution design, architecture, and hands-on delivery within the Azure cloud and Azure DevOps environment (advanced Azure knowledge). Extensive experience with relevant hosting solutions like Azure Functions, Azure Database for MySQL, Azure Blobs, and modern network design, plus a senior technical support role. An ongoing willingness to learn, upskill in cutting-edge technologies, train, coach, and mentor.
Posted 1 month ago
5.0 - 10.0 years
15 - 30 Lacs
Hyderabad, Pune, Delhi / NCR
Work from Office
Job Description: The candidate should be an experienced professional with a data engineering background, able to work without much guidance.
Responsibilities: Design, develop, and maintain scalable ETL pipelines using Azure services to process, transform, and load large datasets into AWS Data Lake or other data stores. Collaborate with cross-functional teams, including data architects, analysts, and business stakeholders, to gather data requirements and deliver efficient data solutions. Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure. Work with data scientists and analysts to understand data needs and create effective data workflows. Create and maintain data storage solutions including Azure SQL Database, Azure Data Lake, and Azure Blob Storage. Using Azure Data Factory or comparable technologies, create and maintain ETL (Extract, Transform, Load) operations. Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data. Improve the scalability, efficiency, and cost-effectiveness of data pipelines. Monitor and resolve data pipeline problems to guarantee consistency and availability of the data.
Key Skill Sets Required: Experience in designing and hands-on development of cloud-based analytics solutions. Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required. Design and building of data pipelines using API ingestion and streaming ingestion methods. Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is desirable. Strong experience in common data warehouse modelling principles, including Kimball and Inmon. Knowledge of Azure Databricks, Azure IoT, Azure HDInsight + Spark, Azure Stream Analytics, and Power BI is desirable. Working knowledge of Python is desirable. Experience developing security models.
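The data validation and cleansing step mentioned above can be sketched in plain Python; the field names and rules are made-up stand-ins for whatever an actual pipeline would enforce.

```python
from datetime import datetime, timezone

def cleanse(records):
    """Reject rows missing mandatory fields and normalise the rest -
    a toy stand-in for a pipeline's validation/cleansing stage."""
    clean = []
    for row in records:
        if not row.get("id") or row.get("amount") is None:
            continue  # drop rows that fail basic validation
        clean.append({
            "id": str(row["id"]).strip(),
            "amount": round(float(row["amount"]), 2),
            "loaded_at": datetime.now(timezone.utc).isoformat(timespec="seconds"),
        })
    return clean

rows = cleanse([{"id": " 42 ", "amount": "10.50"}, {"id": None, "amount": 1}])
```

In a real ADF or Databricks pipeline the same checks would run at scale inside a dataflow or notebook activity; the logic, though, is this simple filter-and-normalise shape.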
Posted 1 month ago
6.0 - 7.0 years
14 - 18 Lacs
Pune
Work from Office
As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include: Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications, such as Elasticsearch and Splunk, for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results.
Required education: Bachelor's degree. Preferred education: Master's degree.
Required technical and professional expertise: Total experience 6-7 years (relevant 4-5 years). Mandatory Skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob. Ability to use programming languages like Java, Python, Scala, etc., to build pipelines that extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed.
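The pipeline work this role describes (extract from a repository, transform, load for a data consumer) can be reduced to a minimal sketch. The CSV fields and the JSON "load" target are illustrative only, not a real warehouse write.

```python
import csv
import io
import json

def extract(raw_csv: str):
    """Extract: parse rows from a CSV source."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    """Transform: keep completed orders and cast amounts to float."""
    return [
        {"order_id": r["order_id"], "amount": float(r["amount"])}
        for r in rows
        if r["status"] == "completed"
    ]

def load(rows) -> str:
    """Load: serialise for a downstream consumer (a stand-in for a warehouse write)."""
    return json.dumps(rows)

raw = "order_id,status,amount\n1,completed,9.50\n2,cancelled,3.00\n"
out = load(transform(extract(raw)))
```

In practice the same three stages would be expressed as PySpark dataframe operations on Databricks, but the extract/transform/load decomposition is identical.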
Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java.
Preferred technical and professional experience: You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
Posted 1 month ago
6.0 - 7.0 years
14 - 18 Lacs
Bengaluru
Work from Office
As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include: Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications, such as Elasticsearch and Splunk, for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results.
Required education: Bachelor's degree. Preferred education: Master's degree.
Required technical and professional expertise: Total experience 6-7 years (relevant 4-5 years). Mandatory Skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob. Ability to use programming languages like Java, Python, Scala, etc., to build pipelines that extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed.
Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java.
Preferred technical and professional experience: You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
Posted 1 month ago
8.0 - 13.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Employment Type: Contract
Skills: Azure Data Factory, SQL, Azure Blob, Azure Logic Apps
Posted 1 month ago
2.0 - 6.0 years
2 - 6 Lacs
Vadodara, Gujarat, India
On-site
Internal Job Title: Data Pipeline Analyst
Business: Lucy Electric Manufacturing & Technologies India
Location: Halol, Vadodara, Gujarat
Job Reference No: 3940
Job Purpose: To support the provision of key business insights by building and maintaining data pipelines and structures, using programming tools and languages including Python and MS SQL.
Job Context: Working closely with the Data & Analytics Development Lead and cross-functional teams to ensure a coordinated approach to Business Intelligence delivery. The role involves providing information across multiple businesses for comparative and predictive analysis, highlighting opportunities for business process improvement.
Job Dimensions: The role is onsite, with flexible attendance at our office in Vadodara, India, to support business engagement. There is an occasional need to visit other sites and business partners at their premises to build stakeholder relationships, or to attend specific industry events globally.
Key Accountabilities: Capturing requirements and preparing specifications for data pipelines and reporting. Developing prioritised BI outputs to agreed quality and security standards. Assisting the Data & Analytics team with technical integration of data sources. Conducting training and coaching sessions to support business users' understanding of data. Collaborating with the wider business to promote appropriate use of data & analytics tools. Maintaining operational and customer-facing documentation for support processes and defined project deliverables. Improving analytics capabilities for BI services in an evergreen ecosystem. Troubleshooting production issues and coordinating with the wider IT team to resolve incidents and complete tasks using IT Service Management tools, as part of a cross-functional team.
Qualifications, Experience & Skills: A bachelor's degree (or equivalent professional qualifications and experience) in a relevant stream. Effective communication skills in English. 4 years of experience in data transformation and/or creating data pipelines, including Python. Good understanding of Microsoft data storage tools such as Azure Blob and Data Lake. Working knowledge of statistical methods to validate findings, ensure data accuracy, and drive data-driven decision making. Knowledge of Exploratory Data Analysis to identify key insights, potential issues, and areas for further investigation. Ability to conduct design reviews, propose enhancements, and design best-fit solutions. Ability to provide Business as Usual (BAU) support and ad-hoc reporting requirements alongside project work. Ability to identify process improvement and efficiency opportunities through data analysis and recommend automation or optimisation. General understanding of a company's value chain and basic manufacturing industry terminology.
Good to Have Skills: ETL using data pipeline tools (e.g., SSIS, Azure Data Factory, or similar). Microsoft SQL, Dynamics 365 (D365), Microsoft Dataverse. REST APIs, CI/CD on Azure DevOps. Data quality, data sensitivity, near-real-time and real-time data processing.
Behavioral Competencies: Good interpersonal skills to enable process improvement through positive interaction. Problem-solving mindset with a desire to share knowledge and support others. Customer-oriented, flexible, and focused on stakeholder satisfaction.
Does this sound interesting? We would love to hear from you. Our application process is quick and easy. Apply today!
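The statistical validation mentioned in the accountabilities above can be illustrated with a stdlib-only median-absolute-deviation outlier check; the numbers are invented for the example.

```python
import statistics

def mad_outliers(values, k: float = 3.0):
    """Flag points far from the median (median-absolute-deviation rule),
    a simple sanity check on numeric pipeline output."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    return [v for v in values if abs(v - med) > k * mad]

bad = mad_outliers([10.1, 9.8, 10.0, 10.2, 55.0])  # one obviously bad reading
```

A median-based rule is chosen here over mean/standard deviation because a single extreme value distorts the mean enough to hide itself in small samples.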
Posted 1 month ago
0.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.
Inviting applications for the role of Lead Consultant - Snowflake!
Responsibilities: Ability to design and implement effective analytics solutions and models with Snowflake. Hands-on experience in Snowflake SQL, writing SQL queries against Snowflake, and developing scripts in Unix, Python, etc. to extract, load, and transform data. Hands-on experience with Snowflake utilities such as SnowSQL, SnowPipe, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures. Should be able to implement Snowpipe, stages, and file upload to a Snowflake database. Hands-on experience with any RDBMS/NoSQL database, with strong SQL writing skills. In-depth understanding of Data Warehouse/ODS, ETL concepts, and modeling structure principles. Experience in data warehousing: OLTP, OLAP, dimensions, facts, and data modeling. Hands-on experience with Azure Blob.
Qualifications we seek in you! Minimum Qualifications/Skills: SnowSQL, SnowPipe, Tasks, Streams, Time Travel. Certified SnowPro Core. Good understanding of data warehousing and reporting tools. Able to work on own initiative and as a team player. Good organizational skills with cultural awareness and sensitivity. Education: ME/M.Tech./MS (Engg/Sciences) or BE/BTech (Engineering). Industry: Manufacturing/Industrial.
Behavioral Requirements: Lives the client's core values of courage and curiosity to deliver the best business solutions for EL-Business. Ability to work in diversified teams; convey messages and ideas clearly to users and project members; and listen, understand, appreciate, and appropriately respond to users. Excellent team player with strong oral and written communication skills. Possesses strong time management skills. Keeps up to date and informed on the client's technology landscape and client IS strategy, planned or ad-hoc changes.
Preferred Skills/Qualifications: Azure storage services such as Blob, Data Lake, Cosmos DB, and SQL Server.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 1 month ago
5.0 - 8.0 years
7 - 11 Lacs
Chennai
Work from Office
Cloud Migration Specialist | Chennai | Rates including mark-up: 170K/M | No. of positions: 3
Mandatory skills: AWS and Azure cloud migrations; migrating on-prem applications to AWS/Azure. Experience: 5-8 years.
Position Overview: We are seeking a skilled Cloud Engineer with expertise in AWS and Azure cloud migrations. The ideal candidate will lead the migration of on-premises applications to AWS/Azure, optimize cloud infrastructure, and ensure seamless transitions.
Key Responsibilities: Plan and execute migrations of on-prem applications to AWS/Azure. Utilize or develop migration tools for large-scale application migrations. Design and implement automated application migrations. Collaborate with cross-functional teams to troubleshoot and resolve migration issues.
Qualifications: 5+ years of AWS/Azure cloud migration experience. Proficiency in cloud compute (EC2, EKS, Azure VM, AKS) and storage (S3, EBS, EFS, Azure Blob, Azure Managed Disks, Azure Files). Strong knowledge of AWS and Azure cloud services and migration tools. Expert in Terraform. AWS/Azure certification preferred.
Team Management
Resourcing: Forecast talent requirements as per current and future business needs. Hire adequate and right resources for the team. Train direct reportees to make right recruitment and selection decisions.
Talent Management: Ensure 100% compliance with Wipro's standards of adequate onboarding and training for team members to enhance capability and effectiveness. Build an internal talent pool of HiPos and ensure their career progression within the organization. Promote diversity in leadership positions.
Performance Management: Set goals for direct reportees, conduct timely performance reviews and appraisals, and give constructive feedback to direct reports. Ensure that organizational programs like Performance Nxt are well understood and that the team is taking the opportunities presented by such programs to themselves and the levels below.
Employee Satisfaction and Engagement: Lead and drive engagement initiatives for the team. Track team satisfaction scores and identify initiatives to build engagement within the team. Proactively challenge the team with larger and enriching projects/initiatives for the organization or team. Exercise employee recognition and appreciation.
Deliver:
1. Operations of the tower: SLA adherence; knowledge management; CSAT/customer experience; identification of risk issues and mitigation plans
2. New projects: Timely delivery; no unauthorised changes; no formal escalations
Mandatory Skills: Cloud Azure Admin. Experience: 5-8 years.
Posted 1 month ago
0.0 - 2.0 years
1 - 1 Lacs
Bengaluru
Work from Office
Join us as a Software Dev Intern! Work on Next.js, React, Node.js, and PostgreSQL to build scalable apps, APIs, and user-friendly UIs. Collaborate, write clean code, and help shape real features in a fast-paced environment.
Posted 1 month ago
6.0 - 7.0 years
8 - 9 Lacs
Bengaluru
Work from Office
As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include: Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications, such as Elasticsearch and Splunk, for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results.
Required education: Bachelor's degree. Preferred education: Master's degree.
Required technical and professional expertise: Total experience 6-7 years (relevant 4-5 years). Mandatory Skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob. Ability to use programming languages like Java, Python, Scala, etc., to build pipelines that extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed.
Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java.
Preferred technical and professional experience: You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
Posted 1 month ago
6.0 - 7.0 years
8 - 9 Lacs
Pune
Work from Office
As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include: Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications, such as Elasticsearch and Splunk, for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors.
Required education: Bachelor's degree. Preferred education: Master's degree.
Required technical and professional expertise: Total experience 6-7 years (relevant 4-5 years). Mandatory Skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob. Ability to use programming languages like Java, Python, Scala, etc., to build pipelines that extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed. Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java.
Preferred technical and professional experience: You thrive on teamwork and have excellent verbal and written communication skills.
Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
Posted 1 month ago
6.0 - 7.0 years
8 - 9 Lacs
Pune
Work from Office
As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include: Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications, such as Elasticsearch and Splunk, for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors.
Required education: Bachelor's degree. Preferred education: Master's degree.
Required technical and professional expertise: Total experience 6-7 years (relevant 4-5 years). Mandatory Skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob. Ability to use programming languages like Java, Python, Scala, etc., to build pipelines that extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed. Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java.
Preferred technical and professional experience: You thrive on teamwork and have excellent verbal and written communication skills.
Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
Posted 1 month ago