4.0 - 5.0 years
3 - 7 Lacs
Mumbai, Pune, Chennai
Work from Office
Job Category: IT | Job Type: Full Time | Job Location: Bangalore, Chennai, Mumbai, Pune | Experience: 4 to 5 years
Azure Data Engineer with QA
Must have: Azure Databricks, Azure Data Factory, Spark SQL
- 4-5 years of development experience in Azure Databricks
- Strong experience in SQL, along with performing quality assurance on Azure Databricks workloads
- Understand complex data systems by working closely with engineering and product teams
- Develop scalable and maintainable applications to extract, transform, and load data in various formats to SQL Server, Hadoop Data Lake, or other data storage locations
Kind note: please apply or share your resume only if it matches the above criteria.
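The extract-transform-load work this role describes can be sketched in miniature. This is an illustrative example only, using Python's built-in sqlite3 in place of SQL Server/Databricks; all table and column names (`raw_orders`, `region_sales`) are invented, and the equivalent Spark SQL statement would look much the same:

```python
import sqlite3

def run_etl(conn):
    """Extract raw order rows, transform (filter + aggregate), load a reporting table."""
    cur = conn.cursor()
    # Extract: hypothetical raw landing table
    cur.execute("CREATE TABLE raw_orders (order_id INTEGER, region TEXT, amount REAL)")
    cur.executemany(
        "INSERT INTO raw_orders VALUES (?, ?, ?)",
        [(1, "APAC", 120.0), (2, "APAC", 80.0), (3, "EMEA", 50.0), (4, "EMEA", None)],
    )
    # Transform + Load: drop null amounts, aggregate per region
    cur.execute(
        """CREATE TABLE region_sales AS
           SELECT region, SUM(amount) AS total_amount, COUNT(*) AS order_count
           FROM raw_orders
           WHERE amount IS NOT NULL
           GROUP BY region"""
    )
    conn.commit()
    return cur.execute(
        "SELECT region, total_amount, order_count FROM region_sales ORDER BY region"
    ).fetchall()

conn = sqlite3.connect(":memory:")
print(run_etl(conn))  # [('APAC', 200.0, 2), ('EMEA', 50.0, 1)]
```

The QA side of the role would then verify such outputs (row counts, null handling, aggregate totals) against the source data.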
Posted 1 month ago
5.0 - 8.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Role Purpose: Support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the team.

Do:
- Oversee and support the process by reviewing daily transactions against performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries within the SLAs defined in the contract
- Develop understanding of the process/product for team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot recurring trends and prevent future problems
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements
- Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing question diagnosis and guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triages to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes, and updates
- Enroll in product-specific and other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver:
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day; compliance to process and quality standards; meeting process-level SLAs; Pulse score; customer feedback; NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability Development | Triages completed, technical test performance

Mandatory Skills: Databricks - Data Engineering. Experience: 5-8 years.
Posted 1 month ago
3.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Role Purpose: Design, test, and maintain software programs for operating systems or applications to be deployed at a client end, ensuring they meet 100% of quality assurance parameters.

Do:
1. Understand the requirements and design of the product/software
- Develop software solutions by studying information needs, systems flow, data usage, and work processes
- Investigate problem areas throughout the software development life cycle
- Facilitate root cause analysis of system issues and the problem statement
- Identify ideas to improve system performance and availability
- Analyze client requirements and convert them into feasible designs
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements
- Confer with project managers to obtain information on software capabilities
2. Perform coding and ensure optimal software/module development
- Determine operational feasibility by evaluating analysis, problem definition, requirements, and proposed software
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing them
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces
- Analyze information to recommend and plan the installation of new systems or modifications to existing ones
- Ensure that code is error free, with no bugs or test failures
- Prepare reports on programming project specifications, activities, and status
- Ensure all issues are raised per the norms defined for the project/program/account, with clear descriptions and replication patterns
- Compile timely, comprehensive, and accurate documentation and reports as requested
- Coordinate with the team on daily project status and progress, and document it
- Provide feedback on usability and serviceability, trace results to quality risks, and report them to concerned stakeholders
3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution
- Capture all requirements and clarifications from the client for better-quality work
- Take feedback regularly to ensure smooth and on-time delivery
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements
- Document and demonstrate solutions through documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code
- Document necessary details and reports formally so the software is properly understood from client proposal to implementation
- Ensure good quality of interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette, etc.
- Respond to customer requests in a timely manner, with no instances of complaints either internally or externally

Deliver:
No. | Performance Parameter | Measure
1 | Continuous integration, deployment & monitoring of software | 100% error-free onboarding & implementation; throughput %; adherence to the schedule/release plan
2 | Quality & CSAT | On-time delivery; manage software; troubleshoot queries; customer experience; completion of assigned certifications for skill upgradation
3 | MIS & reporting | 100% on-time MIS & report generation

Mandatory Skills: Databricks - Data Engineering. Experience: 3-5 years.
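The software-validation duty above (designing test cases/scenarios and executing them) can be sketched as a small table-driven harness. This is a generic illustration, not the employer's framework; `normalize_record` is a hypothetical function under test:

```python
def normalize_record(record):
    """Hypothetical function under test: trims the name and coerces amount to float."""
    return {"name": record["name"].strip(), "amount": float(record["amount"])}

# Table-driven test cases: (input, expected output)
CASES = [
    ({"name": "  Ada ", "amount": "1"}, {"name": "Ada", "amount": 1.0}),
    ({"name": "Bob", "amount": "2.5"}, {"name": "Bob", "amount": 2.5}),
]

def run_cases():
    """Execute every case; an empty failure list means all cases passed."""
    return [(inp, exp) for inp, exp in CASES if normalize_record(inp) != exp]

print("failures:", run_cases())  # failures: []
```

In practice each failing case, its replication pattern, and the fix would be documented per the project norms described above.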
Posted 1 month ago
5.0 - 8.0 years
9 - 14 Lacs
Pune
Work from Office
Role Purpose: Support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the team.

Do:
- Oversee and support the process by reviewing daily transactions against performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries within the SLAs defined in the contract
- Develop understanding of the process/product for team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot recurring trends and prevent future problems
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements
- Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing question diagnosis and guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triages to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes, and updates
- Enroll in product-specific and other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver:
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day; compliance to process and quality standards; meeting process-level SLAs; Pulse score; customer feedback; NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability Development | Triages completed, technical test performance

Mandatory Skills: Databricks - Data Engineering. Experience: 5-8 years.
Posted 1 month ago
2.0 - 6.0 years
14 - 19 Lacs
Hyderabad
Work from Office
Job Area: Information Technology Group > Systems Analysis

General Summary:
- Proven experience in testing, particularly in data engineering
- Strong coding skills in languages such as Python/Java
- Proficiency in SQL and NoSQL databases
- Hands-on experience in data engineering, ETL processes, and data warehousing QA activities
- Design and develop automated test frameworks for data pipelines and ETL processes
- Use tools and technologies such as Selenium, Jenkins, and Python to automate test execution
- Experience with cloud platforms such as AWS, Azure, or Google Cloud
- Familiarity with data technologies like Databricks, Hadoop, PySpark, and Kafka
- Understanding of CI/CD pipelines and DevOps practices
- Knowledge of containerization technologies like Docker and Kubernetes
- Experience with performance testing and monitoring tools
- Familiarity with version control systems like Git
- Exposure to Agile and DevOps methodologies
- Experience in test case creation, functional and regression testing, defect creation, and root cause analysis
- Good verbal and written communication, analytical, and problem-solving skills
- Ability to work with team members around the globe (US, Taiwan, India, etc.) to provide required support
- Overall 10+ years of experience

Principal Duties and Responsibilities:
- Manages project priorities, deadlines, and deliverables with minimal supervision
- Determines which work tasks are most important for self and junior personnel, avoids distractions, and independently deals with setbacks in a timely manner
- Understands relevant business and IT strategies, contributes to cross-functional discussion, and maintains relationships with IT and customer peers
- Seeks out learning opportunities to increase own knowledge and skill within and outside of domain of expertise
- Serves as a technical lead on a subsystem or small feature, assigns work to a small project team, and works on advanced tasks to complete a project
- Communicates with the project lead via email and direct conversation to make recommendations about overcoming impending obstacles
- Adapts to significant changes and setbacks in order to manage pressure and meet deadlines independently
- Collaborates with more senior Systems Analysts and/or business partners to document and present recommendations for improvements to existing applications and systems
- Acts as a technical resource for less knowledgeable personnel
- Manages projects of small to medium size and complexity, performs tasks, and applies expertise in the subject area to meet deadlines
- Anticipates complex issues and discusses them within and outside the project team to maintain open communication
- Identifies test scenarios and/or cases, oversees test execution, and provides QA results to the business across a few projects; assists with defining test strategies and testing methods, and conducts business risk assessments
- Performs troubleshooting, assists on complex issues related to bugs in production systems or applications, and collaborates with business subject matter experts on issues
- Assists and/or mentors other team members for training and performance management purposes, disseminates subject matter knowledge, and trains the business on how to use tools

Level of Responsibility:
- Works under some supervision
- Takes responsibility for own work and makes decisions that are moderate in impact; errors may have relatively minor financial impact or effect on projects, operations, or customer relationships, and may require involvement beyond the immediate work group to correct
- Uses verbal and written communication skills to convey complex and/or detailed information to multiple individuals/audiences with differing knowledge levels
- Role may require strong negotiation and influence, and communication to large groups or high-level constituents
- Has a moderate amount of influence over key organizational decisions (e.g., is consulted by senior leadership to provide input on key decisions)
- Uses deductive and inductive problem solving; multiple approaches may be necessary to solve the problem; information is often missing or incomplete; intermediate data analysis/interpretation skills may be required
- Exercises creativity to draft original documents, imagery, or work products within established guidelines

Minimum Qualifications: 4+ years of IT-relevant work experience with a Bachelor's degree, OR 6+ years of IT-relevant work experience without a Bachelor's degree. Minimum 6-8 years of proven experience in testing, particularly in data engineering.

Preferred Qualifications: Proven experience in testing, particularly in data engineering; 10+ years of QA/testing experience; strong coding skills in languages such as Python/Java; proficiency in SQL and NoSQL databases.

Applicants: Qualcomm is an equal opportunity employer. If you are an individual with a disability and need an accommodation during the application/hiring process, rest assured that Qualcomm is committed to providing an accessible process. You may e-mail disability-accomodations@qualcomm.com or call Qualcomm's toll-free number found here. Upon request, Qualcomm will provide reasonable accommodations to support individuals with disabilities to be able to participate in the hiring process. Qualcomm is also committed to making our workplace accessible for individuals with disabilities. (Keep in mind that this email address is used to provide reasonable accommodations for individuals with disabilities. We will not respond here to requests for updates on applications or resume inquiries.)
Qualcomm expects its employees to abide by all applicable policies and procedures, including but not limited to security and other requirements regarding protection of Company confidential information and other confidential and/or proprietary information, to the extent those requirements are permissible under applicable law. To all staffing and recruiting agencies: please do not forward resumes to our jobs alias, Qualcomm employees, or any other company location. Qualcomm is not responsible for any fees related to unsolicited resumes/applications. If you would like more information about this role, please contact Qualcomm Careers.
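The automated ETL/data-pipeline testing this listing calls for typically starts with simple data-quality assertions over pipeline output. A minimal sketch of such a check (all field names are invented for illustration; a real framework would run this via Jenkins against Databricks or Hadoop output):

```python
def check_pipeline_output(rows, required_fields):
    """Run basic data-quality assertions of the kind an automated ETL test suite
    performs: non-empty output, required fields present, no nulls in required fields."""
    issues = []
    if not rows:
        issues.append("output is empty")
    for i, row in enumerate(rows):
        for field in required_fields:
            if field not in row:
                issues.append(f"row {i}: missing field '{field}'")
            elif row[field] is None:
                issues.append(f"row {i}: null value in '{field}'")
    return issues

good = [{"id": 1, "value": 10}, {"id": 2, "value": 20}]
bad = [{"id": 1, "value": None}, {"value": 5}]
print(check_pipeline_output(good, ["id", "value"]))  # []
print(check_pipeline_output(bad, ["id", "value"]))
```

Checks like these are easy to parameterize per pipeline and wire into a CI/CD job, which is the pattern the listing's "automated test frameworks" requirement points at.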
Posted 1 month ago
2.0 - 4.0 years
9 - 14 Lacs
Hyderabad
Work from Office
Overview
We are PepsiCo. We believe that acting ethically and responsibly is not only the right thing to do, but also the right thing to do for our business. At PepsiCo, we aim to deliver top-tier financial performance over the long term by integrating sustainability into our business strategy, leaving a positive imprint on society and the environment. We call this winning with pep+ (PepsiCo Positive). For more information on PepsiCo and the opportunities it holds, visit www.pepsico.com.

PepsiCo Data Analytics & AI Overview
With data deeply embedded in our DNA, PepsiCo Data, Analytics and AI (DA&AI) transforms data into consumer delight. We build and organize business-ready data that allows PepsiCo's leaders to solve their problems with the highest degree of confidence. Our platform of data products and services ensures data is activated at scale. This enables new revenue streams, deeper partner relationships, new consumer experiences, and innovation across the enterprise.

The Data Science pillar in DA&AI is the organization that Data Scientists and ML Engineers report into within the broader D+A organization. DS also leads, facilitates, and collaborates with the larger DS community in PepsiCo, provides the talent for the development and support of DS components and their life cycle within DA&AI products, and supports pre-engagement activities as requested and validated by the DA&AI prioritization framework.

Data Scientist - Gurugram and Hyderabad
The role will work on developing Machine Learning (ML) and Artificial Intelligence (AI) projects. The specific scope of this role is to develop ML solutions in support of ML/AI projects using big-analytics toolsets in a CI/CD environment. Analytics toolsets may include DS tools, Spark, Databricks, and other technologies offered by Microsoft Azure or open-source toolsets. This role will also help automate the end-to-end cycle with Machine Learning Services and Pipelines.
Responsibilities
- Deliver key Advanced Analytics/Data Science projects within time and budget, particularly around DevOps/MLOps and the Machine Learning models in scope
- Collaborate with data engineers and ML engineers to understand data and models and leverage various advanced analytics capabilities
- Ensure on-time and on-budget delivery that satisfies project requirements, while adhering to enterprise architecture standards
- Use big data technologies to help process data and build scaled data pipelines (batch to real time)
- Automate the end-to-end ML lifecycle with Azure Machine Learning and Azure/AWS/GCP pipelines
- Set up cloud alerts, monitors, dashboards, and logging, and troubleshoot machine learning infrastructure
- Automate ML model deployments

Qualifications
- Minimum 3 years of hands-on work experience in data science / machine learning
- Minimum 3 years of SQL experience
- Experience in DevOps and Machine Learning (ML), with hands-on experience with one or more cloud service providers
- BE/BS in Computer Science, Math, Physics, or other technical fields
- Data science: hands-on experience and strong knowledge of building supervised and unsupervised machine learning models
- Programming skills: hands-on experience in statistical programming languages like Python and database query languages like SQL
- Statistics: good applied statistical skills, including knowledge of statistical tests, distributions, regression, and maximum likelihood estimators
- Cloud: experience in Databricks and ADF is desirable
- Familiarity with Spark, Hive, and Pig is an added advantage
- Model deployment experience is a plus
- Experience with version control systems like GitHub and CI/CD tools
- Experience in exploratory data analysis
- Knowledge of MLOps/DevOps and deploying ML models is required
- Experience using MLflow, Kubeflow, etc. is preferred
- Experience executing and contributing to MLOps automation infrastructure is good to have
- Exceptional analytical and problem-solving skills
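The statistics expectations above (regression, maximum likelihood estimators) can be illustrated with a pure-Python ordinary least squares fit; under i.i.d. Gaussian noise, the OLS slope and intercept are also the maximum likelihood estimates. The data points below are made up for the example:

```python
def ols_fit(xs, ys):
    """Closed-form ordinary least squares for y = a*x + b.
    Under i.i.d. Gaussian errors this is also the maximum likelihood estimate."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)          # variance term
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))  # covariance term
    a = sxy / sxx
    b = mean_y - a * mean_x
    return a, b

xs = [0, 1, 2, 3]
ys = [1, 3, 5, 7]  # exactly y = 2x + 1
print(ols_fit(xs, ys))  # (2.0, 1.0)
```

In the role itself this would of course be done with libraries (statsmodels, scikit-learn, Spark ML), but the closed form is what those estimators compute.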
Posted 1 month ago
2.0 - 4.0 years
9 - 14 Lacs
Hyderabad, Gurugram
Work from Office
Overview
We are PepsiCo. We believe that acting ethically and responsibly is not only the right thing to do, but also the right thing to do for our business. At PepsiCo, we aim to deliver top-tier financial performance over the long term by integrating sustainability into our business strategy, leaving a positive imprint on society and the environment. We call this winning with pep+ (PepsiCo Positive). For more information on PepsiCo and the opportunities it holds, visit www.pepsico.com.

PepsiCo Data Analytics & AI Overview
With data deeply embedded in our DNA, PepsiCo Data, Analytics and AI (DA&AI) transforms data into consumer delight. We build and organize business-ready data that allows PepsiCo's leaders to solve their problems with the highest degree of confidence. Our platform of data products and services ensures data is activated at scale. This enables new revenue streams, deeper partner relationships, new consumer experiences, and innovation across the enterprise.

The Data Science pillar in DA&AI is the organization that Data Scientists and ML Engineers report into within the broader D+A organization. DS also leads, facilitates, and collaborates with the larger DS community in PepsiCo, provides the talent for the development and support of DS components and their life cycle within DA&AI products, and supports pre-engagement activities as requested and validated by the DA&AI prioritization framework.

Data Scientist - Gurugram and Hyderabad
The role will work on developing Machine Learning (ML) and Artificial Intelligence (AI) projects. The specific scope of this role is to develop ML solutions in support of ML/AI projects using big-analytics toolsets in a CI/CD environment. Analytics toolsets may include DS tools, Spark, Databricks, and other technologies offered by Microsoft Azure or open-source toolsets. This role will also help automate the end-to-end cycle with Machine Learning Services and Pipelines.
Responsibilities
- Deliver key Advanced Analytics/Data Science projects within time and budget, particularly around DevOps/MLOps and the Machine Learning models in scope
- Collaborate with data engineers and ML engineers to understand data and models and leverage various advanced analytics capabilities
- Ensure on-time and on-budget delivery that satisfies project requirements, while adhering to enterprise architecture standards
- Use big data technologies to help process data and build scaled data pipelines (batch to real time)
- Automate the end-to-end ML lifecycle with Azure Machine Learning and Azure/AWS/GCP pipelines
- Set up cloud alerts, monitors, dashboards, and logging, and troubleshoot machine learning infrastructure
- Automate ML model deployments

Qualifications
- Minimum 3 years of hands-on work experience in data science / machine learning
- Minimum 3 years of SQL experience
- Experience in DevOps and Machine Learning (ML), with hands-on experience with one or more cloud service providers
- BE/BS in Computer Science, Math, Physics, or other technical fields
- Data science: hands-on experience and strong knowledge of building supervised and unsupervised machine learning models
- Programming skills: hands-on experience in statistical programming languages like Python and database query languages like SQL
- Statistics: good applied statistical skills, including knowledge of statistical tests, distributions, regression, and maximum likelihood estimators
- Cloud: experience in Databricks and ADF is desirable
- Familiarity with Spark, Hive, and Pig is an added advantage
- Model deployment experience is a plus
- Experience with version control systems like GitHub and CI/CD tools
- Experience in exploratory data analysis
- Knowledge of MLOps/DevOps and deploying ML models is required
- Experience using MLflow, Kubeflow, etc. is preferred
- Experience executing and contributing to MLOps automation infrastructure is good to have
- Exceptional analytical and problem-solving skills
Posted 1 month ago
10.0 - 15.0 years
4 - 7 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Job Title: Power BI
Experience: 10-15 years
Location: Chennai, Bangalore, Hyderabad

1. Strong Power BI technical expertise, with Power Automate and paginated reports skillset [these skills are mandatory].
2. Strong stakeholder management, given this role will also help define requirements/wireframes, working with the business.
3. Strong delivery management.
4. Life insurance experience is desirable but not essential.
- A bachelor's degree in information technology or a related discipline, with 10+ years of managing delivery and operation of BI and analytics platforms and services, preferably in the insurance or financial industry
- Deep understanding of Power BI, data visualization practices, and the underlying data engineering and modelling to support the reporting data layer, preferably in Databricks or similar
- Experience in reporting within the life insurance domain (covering claims, policy, underwriting) is not mandatory but is highly valued
- Proven experience in independently leading the technical delivery of a team of BI engineers, both onshore and offshore, while effectively managing delivery risks and issues
- Excellent communication skills, with the ability to convey technical concepts to both technical and non-technical stakeholders
- Strong leadership skills, including the ability to mentor, coach, and develop a high-performing business intelligence team
- Migration experience from Cognos and Tableau to Power BI will be highly regarded
5. Develop and guide team members in enhancing their technical capabilities and increasing productivity; prepare and submit status reports to minimize exposure and risks on the project, or closure of escalations. Be responsible for providing technical guidance/solutions; define, advocate, and implement best practices and coding standards for the team. Ensure process compliance in the assigned module, and participate in technical discussions/reviews as a technical consultant for feasibility studies (technical alternatives, best packages, supporting architecture best practices, technical risks, breakdown into components, estimations).
6. Technical lead candidate who will be able to support and resolve any queries raised by team members from the project for Power Automate and paginated reports.
Posted 1 month ago
12.0 - 15.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Job Title: Architect - AWS Databricks, SQL
Experience: 12-15 years
Location: Bangalore
Skills: Architect, AWS, Databricks, SQL
Posted 1 month ago
12.0 - 20.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Job Title: Senior Software Engineer
Experience: 12-20 years
Location: Bangalore

- Strong knowledge of and hands-on experience in AWS Databricks
- Nice to have: worked in the HP ecosystem (FDL architecture)
- Technically strong, to help the team with any technical issues they face during execution
- Owns the end-to-end technical deliverables
- Hands-on Databricks + SQL knowledge
- Experience in AWS S3, Redshift, EC2, and Lambda services
- Extensive experience in developing and deploying big data pipelines
- Experience in Azure Data Lake
- Strong hands-on SQL development (including Azure SQL), with an in-depth understanding of optimization and tuning techniques in SQL with Redshift
- Development in notebooks (Jupyter, Databricks, Zeppelin, etc.)
- Development experience in Spark
- Experience in a scripting language like Python, plus any other programming language

Roles and Responsibilities:
- Candidate must have hands-on experience in AWS Databricks
- Good development experience using Python/Scala, Spark SQL, and DataFrames
- Hands-on experience with Databricks, Data Lake, and SQL is a must
- Performance tuning, troubleshooting, and debugging of Spark

Process Skills: Agile - Scrum
Qualification: Bachelor of Engineering (computer background preferred)
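The SQL "optimization and tuning" expectation above usually starts with reading query plans and adding the right index. A small sketch using Python's built-in sqlite3 as a stand-in (the role would do this against Redshift or Azure SQL; the table and index names are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, ts TEXT)")
cur.executemany(
    "INSERT INTO events (user_id, ts) VALUES (?, ?)",
    [(i % 50, f"2024-01-{(i % 28) + 1:02d}") for i in range(1000)],
)

def plan(query):
    """Return the 'detail' column of EXPLAIN QUERY PLAN for each step."""
    return [row[3] for row in cur.execute("EXPLAIN QUERY PLAN " + query)]

q = "SELECT COUNT(*) FROM events WHERE user_id = 7"
before = plan(q)   # full table scan: no index covers user_id yet
cur.execute("CREATE INDEX idx_events_user ON events(user_id)")
after = plan(q)    # the planner now uses idx_events_user instead of scanning

print(before, after)
```

Redshift and Azure SQL expose the same workflow through EXPLAIN / execution plans, with the extra dimensions of distribution and sort keys that the plain B-tree example above doesn't capture.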
Posted 1 month ago
10.0 - 12.0 years
9 - 13 Lacs
Chennai
Work from Office
Job Title: Data Architect
Experience: 10-12 years
Location: Chennai

- 10-12 years of experience as a Data Architect
- Strong expertise in streaming data technologies like Apache Kafka, Flink, Spark Streaming, or Kinesis
- Proficiency in programming languages such as Python, Java, Scala, or Go
- Experience with big data tools like Hadoop and Hive, and data warehouses such as Snowflake, Redshift, Databricks, and Microsoft Fabric
- Proficiency in database technologies (SQL, NoSQL, PostgreSQL, MongoDB, DynamoDB, YugabyteDB)
- Should be flexible to work as an individual contributor
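The core pattern behind the streaming technologies listed (Kafka, Flink, Spark Streaming, Kinesis) is windowed aggregation over an event stream. A broker-free, pure-Python illustration of tumbling (fixed, non-overlapping) windows; the event data is invented:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, key) events into fixed non-overlapping windows and count
    per key: the basic aggregation a Flink/Spark Streaming job runs over a topic."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // window_seconds) * window_seconds  # window assignment
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

events = [(0, "click"), (3, "view"), (4, "click"), (11, "click"), (12, "view")]
print(tumbling_window_counts(events, 10))
# {0: {'click': 2, 'view': 1}, 10: {'click': 1, 'view': 1}}
```

Real engines add what this sketch omits: event-time vs processing-time semantics, watermarks for late data, and fault-tolerant state, which is where most of a streaming architect's design work lives.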
Posted 1 month ago
5.0 - 10.0 years
12 - 22 Lacs
Hyderabad, Bengaluru
Work from Office
Experience Range : 4 - 12+ Year's Work Location : Bangalore (Proffered ) Must Have Skills : Airflow, big query, Hadoop, PySpark, Spark/Scala, Python, Spark - SQL, Snowflake, ETL, Data Modelling, Erwin OR Erwin Studio, Snowflake, Stored Procedure & Functions, AWS, Azure Databricks, Azure Data Factory. No Of Opening's : 10+ Job Description : We are having multiple Salesforce roles with our clients. Role 1 : Data Engineer Role 2 : Support Data Engineer Role 3 : ETL Support Engineer Role 4 : Senior Data Modeler Role 5 : Data Engineer Data Bricks Please find below the JD's for each role Role 1 : Data Engineer 5+ years of experience in data engineering or a related role. Proficiency in Apache Airflow for workflow scheduling and management. Strong experience with Hadoop ecosystems, including HDFS, MapReduce, and Hive. Expertise in Apache Spark/ Scala for large-scale data processing. Proficient in Python Advanced SQL skills for data analysis and reporting. Experience with cloud platforms (e.g., AWS, Google Cloud, Azure) is a plus. Designs, proposes, builds, and maintains databases and datalakes, data pipelines that transform and model data, and reporting and analytics solutions Understands business problems and processes based on direct conversations with customers, can see the big picture, and translate that into specific solutions Identifies issues early, proposes solutions, and tactfully raises concerns and proposes solutions Participates in code peer reviews Articulates clearly pros/cons of various tools/approaches Documents and diagrams proposed solutions Role 2 : Support Data Engineer Prioritize and resolve Business-As-Usual (BAU) support queries within agreed Service Level Agreements (SLA) while ensuring application stability. 
Drive engineering delivery to reduce technical debt across the production environment, collaborating with development and infrastructure teams Perform technical analysis of the production platform to identify and address performance and resiliency issues Participate in the Software Development Lifecycle (SDLC) to improve production standards and controls Build and maintain the support knowledge database, updating the application runbook with known tasks and managing event monitoring Create health check monitors, dashboards, synthetic transactions and alerts to increase monitoring and observability of systems at scale. Participate in on-call rotation supporting application release validation, alert response, and incident management Collaborate with development, product, and customer success teams to identify and resolve technical problems. Research and implement recommendations from post-mortem analyses for continuous improvement. Document issue details and solutions in our ticketing system (JIRA and ServiceNow) Assist in creating and maintaining technical documentation, runbooks, and knowledge base articles Navigate a complex system, requiring deep troubleshooting/debugging skills and an ability to manage multiple contexts efficiently. Oversee the collection, storage, and maintenance of production data, ensuring its accuracy and availability for analysis. Monitor data pipelines and production systems to ensure smooth operation and quickly address any issues that arise. Implement and maintain data quality standards, conducting regular checks to ensure data integrity. Identify and resolve technical issues related to data processing and production systems. Work closely with data engineers, analysts, and other stakeholders to optimize data workflows and improve production efficiency. 
Contribute to continuous improvement initiatives by analyzing data to identify areas for process optimization.
Role 3: ETL Support Engineer
6+ years of experience with ETL support and development.
Experience with popular ETL tools such as Talend and Microsoft SSIS.
Experience with relational databases (e.g., SQL Server, Postgres).
Experience with the Snowflake data warehouse.
Proficiency in writing complex SQL queries for data validation, comparison, and manipulation.
Familiarity with version control systems such as Git/GitHub to manage changes in test cases and scripts.
Knowledge of defect tracking tools such as JIRA and ServiceNow.
Banking domain experience is a must.
Understanding of the ETL process.
Perform functional, integration, and regression testing for ETL processes.
Validate and ensure data quality and consistency across different data sources and targets.
Develop and execute test cases for ETL workflows and data pipelines.
Load testing: ensure that the data warehouse can handle the volume of data being loaded and queried under normal and peak conditions.
Scalability: test the scalability of the data warehouse in terms of data growth and system performance.
Role 4: Senior Data Modeler
7+ years of experience in metadata management, data modelling, and related tools (Erwin, ER Studio, or others); 10+ years of overall IT experience.
Hands-on relational, dimensional, and/or analytic experience (using RDBMSs, dimensional data platform technologies, and ETL and data ingestion).
Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required.
Strong communication and presentation skills.
Help the team implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional) and data tools (reporting, visualization, analytics, and machine learning).
Work with business and application/solution teams to implement data strategies and develop the conceptual/logical/physical data models.
Define and govern data modelling and design standards, tools, best practices, and related development for enterprise data models.
Hands-on experience in modelling and in mappings between source system data models and data warehouse data models.
Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks with respect to modelling and mappings.
Hands-on experience in writing complex SQL queries.
Good to have: experience in data modelling for NoSQL objects.
Role 5: Data Engineer - Databricks
Design and build data pipelines using Spark SQL and PySpark in Azure Databricks.
Design and build ETL pipelines using ADF.
Build and maintain a Lakehouse architecture in ADLS/Databricks.
Perform data preparation tasks including data cleaning, normalization, deduplication, type conversion, etc.
Work with the DevOps team to deploy solutions in production environments.
Control data processes and take corrective action when errors are identified; corrective action may include executing a workaround process and then identifying the cause and solution for the data errors.
Participate as a full member of the global Analytics team, providing solutions for and insights into data-related items.
Collaborate with Data Science and Business Intelligence colleagues across the world to share key learnings, leverage ideas and solutions, and propagate best practices.
Lead projects that include other team members and participate in projects led by other team members.
Apply change management tools including training, communication, and documentation to manage upgrades, changes, and data migrations.
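Role 5 asks for routine data-preparation work: cleaning, normalization, deduplication, and type conversion. As a minimal illustration of those four steps outside any Spark cluster, here is a pure-Python sketch; the record fields (`id`, `name`, `joined`) are hypothetical, not a real source schema:

```python
from datetime import date

def prepare_records(rows):
    """Clean, normalize, deduplicate, and type-convert raw rows.

    Each row is a dict with 'id', 'name', and 'joined' (ISO date string).
    """
    seen = set()
    cleaned = []
    for row in rows:
        key = row["id"]
        if key in seen:  # deduplicate on the business key
            continue
        seen.add(key)
        cleaned.append({
            "id": int(key),                       # type conversion
            "name": row["name"].strip().title(),  # cleaning + normalization
            "joined": date.fromisoformat(row["joined"]),
        })
    return cleaned

raw = [
    {"id": "1", "name": "  alice  ", "joined": "2023-01-05"},
    {"id": "1", "name": "ALICE", "joined": "2023-01-05"},  # duplicate key
    {"id": "2", "name": "bob", "joined": "2022-11-30"},
]
print(prepare_records(raw))
```

In a Databricks pipeline the same logic would typically be expressed with PySpark transformations (`dropDuplicates`, `cast`, string functions) so it scales beyond a single machine.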
Posted 1 month ago
12.0 - 16.0 years
18 - 25 Lacs
Hyderabad
Remote
JD for Fullstack Developer
Exp: 10+ yrs
Front-End Development
Design and implement intuitive and responsive user interfaces using React.js or similar front-end technologies.
Collaborate with stakeholders to create a seamless user experience.
Create mockups and UI prototypes for quick turnaround using Figma, Canva, or similar tools.
Strong proficiency in HTML, CSS, JavaScript, and React.js.
Experience with styling and graph libraries such as Highcharts, Material UI, and Tailwind CSS.
Solid understanding of React fundamentals, including routing, the virtual DOM, and Higher-Order Components (HOCs).
Knowledge of REST API integration.
Understanding of Node.js is a big advantage.
Middleware Development
Experience with REST API development, preferably using FastAPI.
Proficiency in programming languages such as Python.
Integrate APIs and services between front-end and back-end systems.
Experience with Docker and containerized applications.
Back-End Development
Experience with orchestration tools such as Apache Airflow or similar.
Design, develop, and manage simple data pipelines using Databricks, PySpark, and Google BigQuery.
Medium-level expertise in SQL.
Basic understanding of authentication methods such as JWT and OAuth.
Bonus Skills
Experience working with cloud platforms such as AWS, GCP, or Azure.
Familiarity with Google BigQuery and Google APIs.
Hands-on experience with Kubernetes for container orchestration.
Contact: Sandeep Nunna
Ph No: 9493883212
Email: sandeep.nunna@clonutsolutions.com
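The back-end requirements above mention JWT-based authentication. The core of an HS256 JWT (sign the header and payload, then verify with a constant-time comparison) can be sketched with the standard library alone; in production one would reach for a maintained library such as PyJWT rather than hand-rolling this:

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWT uses unpadded base64url encoding.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload, secret):
    """Produce an HS256 JWT: header.payload.signature."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_jwt(token, secret):
    """Return the payload if the signature checks out, else None."""
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        return None
    padded = body + "=" * (-len(body) % 4)  # restore padding before decoding
    return json.loads(base64.urlsafe_b64decode(padded))

token = sign_jwt({"sub": "user-42"}, b"demo-secret")
print(verify_jwt(token, b"demo-secret"))
```

This sketch omits the `exp`/`iat` claims and algorithm allow-listing that a real middleware layer must also enforce.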
Posted 1 month ago
3.0 - 13.0 years
3 - 6 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
We are looking for a skilled .Net Core Developer to join our team. The ideal candidate should have strong expertise in .Net Core, Azure, and Investran, along with experience in building scalable and high-performance applications.
Key Responsibilities:
Develop, maintain, and optimize applications using .Net Core and related technologies.
Work with Azure Cloud and Azure Databricks to build cloud-based solutions.
Implement and manage solutions using Investran.
Develop applications using VB.NET, Entity Framework, LINQ, and .NET MVC.
Apply design patterns to create efficient and maintainable code.
Collaborate with cross-functional teams to design, develop, and deploy software solutions.
Troubleshoot and debug applications to ensure optimal performance.
Required Skills & Qualifications:
Proficiency in .Net Core and .Net MVC.
Experience with Azure Cloud services and Azure Databricks.
Hands-on experience with Investran.
Strong knowledge of VB.NET, Entity Framework, and LINQ.
Understanding of design patterns and best coding practices.
Experience in developing and optimizing high-performance applications.
Excellent problem-solving and communication skills.
Preferred Qualifications:
Experience with cloud-based deployments and containerization.
Familiarity with Agile development methodologies.
If you are passionate about building innovative software solutions and want to be part of a dynamic team, we'd love to hear from you!
Posted 1 month ago
12.0 - 17.0 years
25 - 35 Lacs
Pune, Chennai
Hybrid
• Should have led at least 3 large legacy EDW/data platform modernization and migration engagements (to Snowflake, Databricks, or data on cloud) in the last 5+ years.
• Experience leading all aspects of the project/program life cycle, including
Posted 1 month ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate,
We are hiring a Cloud Architect to design and oversee scalable, secure, and cost-efficient cloud solutions. Great for architects who bridge technical vision with business needs.
Key Responsibilities:
Design cloud-native solutions using AWS, Azure, or GCP.
Lead cloud migration and transformation projects.
Define cloud governance, cost control, and security strategies.
Collaborate with DevOps and engineering teams for implementation.
Required Skills & Qualifications:
Deep expertise in cloud architecture and multi-cloud environments.
Experience with containers, serverless, and microservices.
Proficiency in Terraform, CloudFormation, or equivalent.
Bonus: Cloud certification (AWS/Azure/GCP Architect).
Soft Skills:
Strong troubleshooting and problem-solving skills.
Ability to work independently and in a team.
Excellent communication and documentation skills.
Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.
Kandi Srinivasa
Delivery Manager
Integra Technologies
Posted 1 month ago
3.0 - 6.0 years
12 - 22 Lacs
Gurugram
Work from Office
Overview
170+ Years Strong. Industry Leader. Global Impact.
At Pinkerton, the mission is to protect our clients. To do this, we provide enterprise risk management services and programs specifically designed for each client. Pinkerton employees are one of our most important assets and critical to the delivery of world-class solutions. Bonded together, we share a commitment to integrity, vigilance, and excellence. Pinkerton is an inclusive employer who seeks candidates with diverse backgrounds, experiences, and perspectives to join our family of industry subject matter experts.
The Data Engineer will be part of a high-performing, international team whose goal is to expand Data & Analytics solutions for our CRM application, which is live in all Securitas countries. Together with the dedicated Frontend & BI Developer, you will be responsible for managing and maintaining the Databricks-based BI platform; the processes from data model changes through implementation and development of pipelines are part of the daily focus, but ETL will get most of your attention. Continuous improvement requires the ability to think bigger and work closely with the whole team.
The Data Engineer (ETL Specialist) will collaborate with the Frontend & BI Developer to identify opportunities to improve the BI platform deliverables, specifically for the CEP organization. Cooperation with other departments, such as integrations or specific IT/IS projects and business specialists, is part of the job. The expectation is to always take data privacy into consideration when moving or sharing data; for that purpose, there is a need to develop the security layer as agreed with the legal department.
Responsibilities
Represent Pinkerton's core values of integrity, vigilance, and excellence.
Maintain and develop the Databricks workspace used to host the BI CEP solution.
Advise on changes needed in the data model to accommodate new BI requirements.
Develop and implement new ETL scripts and improve the current ones.
Take ownership of resolving incoming tickets, both incidents and requests.
Plan activities to stay close to the Frontend & BI Developer and foresee upcoming changes to the backend.
Improve teamwork across team members, using the DevOps tool to track the status of deliverables from start to end.
Ensure understanding and visible implementation of the company's core values of Integrity, Vigilance, and Helpfulness.
Maintain knowledge of the skills and experience available and required in your area, today and tomorrow, and liaise with other departments if needed.
All other duties, as assigned.
Qualifications
At least 3+ years of experience in data engineering.
Understanding of designing and implementing data processing architectures in Azure environments.
Experience with different SSAS modelling techniques (preferably Azure/Databricks/Microsoft-related).
Understanding of data management and treatment to secure data governance and security (platform management and administration).
An analytical mindset with clear communication and problem-solving skills.
Experience working in a SCRUM setup.
Fluent in English, both spoken and written. Bonus: knowledge of additional language(s).
Ability to communicate, present, and influence credibly at all levels, both internally and externally.
Business acumen and commercial awareness.
Working Conditions:
With or without reasonable accommodation, requires the physical and mental capacity to effectively perform all essential functions; regular computer usage. Occasional reaching and lifting of small objects and operating office equipment. Frequent sitting, standing, and/or walking. Travel, as required.
Pinkerton is an equal opportunity employer to all applicants and positions without regard to race/ethnicity, color, national origin, ancestry, sex/gender, gender identity/expression, sexual orientation, marital/prenatal status, pregnancy/childbirth or related conditions, religion, creed, age, disability, genetic information, veteran status, or any protected status by local, state, federal or country-specific law.
Posted 1 month ago
5.0 - 9.0 years
25 - 40 Lacs
Pune
Work from Office
Position Summary:
As a member of Redaptive's AI team, you will be driving Agentic AI and Generative AI integration across all of Redaptive's business units. You will drive AI development and integration across the organization, directly impacting Redaptive's global sustainability efforts and shaping how we leverage AI to serve Fortune 500 clients.
Responsibilities and Duties:
Strategic Leadership (10%):
Champion the AI/ML roadmap, driving strategic planning and execution for all initiatives.
Provide guidance on data science projects (Agentic AI, Generative AI, and Machine Learning), aligning them with business objectives and best practices.
Foster a data-driven culture, advocating for AI-powered solutions to business challenges and efficiency improvements.
Collaborate with product management, engineering, and business stakeholders to identify opportunities and deliver impactful solutions.
Technical Leadership (40%):
Architect and develop Proof-of-Concept (POC) solutions for Agentic AI, Generative AI, and ML, utilizing Python and relevant data science libraries and leveraging MLflow.
Provide technical guidance on AI projects, ensuring alignment with business objectives and best practices.
Assist in the development and documentation of standards for ethical and regulatory-compliant AI usage.
Stay current with AI advancements, contributing to the team's knowledge and expertise.
Perform hands-on data wrangling and AI model development.
Operational Leadership (50%):
Drive continuous improvement through Agentic AI, Generative AI, and predictive modeling.
Participate in Agile development processes (Scrum and Kanban).
Ensure compliance with regulatory and ethical AI standards.
Other duties as assigned.
Required Abilities and Skills:
Agentic AI development and deployment.
Statistical modeling, machine learning algorithms, and data mining techniques.
Databricks and MLflow for model training, deployment, and management on AWS.
Working with large datasets on AWS and Databricks.
Strong hands-on experience with Agentic AI development and deployment, and with large datasets on AWS and Databricks.
Desired Experience:
Statistical modeling, machine learning algorithms, and data mining techniques.
Databricks and MLflow for model training, deployment, and management on AWS.
Experience integrating AI with IoT/event data.
Experience with real-time and batch inference integration with SaaS applications.
International team management experience.
Track record of successful product launches in regulated environments.
Education and Experience:
5+ years of data science/AI experience.
Bachelor's degree in Statistics, Data Science, Computer Engineering, Mathematics, or a related field (Master's preferred).
Proven track record of deploying successful Agentic AI, Generative AI, and ML projects from concept to production.
Excellent communication skills, able to explain complex technical concepts to both technical and non-technical audiences.
Posted 1 month ago
8.0 - 10.0 years
10 - 14 Lacs
Pune
Work from Office
Salary: 20 - 28 LPA
About The Role - Mandatory Skills: AWS Architect, AWS Glue or Databricks, PySpark, and Python
- Hands-on experience with AWS Glue or Databricks, PySpark, and Python.
- Minimum of 2 years of hands-on expertise in PySpark, including Spark job performance optimization techniques.
- Minimum of 2 years of hands-on involvement with AWS Cloud.
- Hands-on experience with Step Functions, Lambda, S3, Secrets Manager, Snowflake/Redshift, RDS, and CloudWatch.
- Proficiency in crafting low-level designs for data warehousing solutions on AWS cloud.
- Proven track record of implementing big-data solutions within the AWS ecosystem, including Data Lakes.
- Familiarity with data warehousing, data quality assurance, and monitoring practices.
- Demonstrated capability in constructing scalable data pipelines and ETL processes.
- Proficiency in testing methodologies and validating data pipelines.
- Experience with or working knowledge of DevOps environments.
- Practical experience in data security services.
- Understanding of data modeling, integration, and design principles.
- Strong communication and analytical skills.
- A dedicated team player with a goal-oriented mindset, committed to delivering quality work with attention to detail.
- Solution Design: Collaborate with clients and stakeholders to understand business requirements and translate them into cloud-based solutions utilizing AWS services (EC2, Lambda, S3, RDS, VPC, IAM, etc.).
- Architecture and Implementation: Design and implement secure, scalable, and high-performance cloud solutions, ensuring alignment with AWS best practices and architectural principles.
- Cloud Migration: Assist with the migration of on-premise applications to AWS, ensuring minimal disruption and maximum efficiency.
- Technical Leadership: Provide technical leadership and guidance to development teams to ensure adherence to architecture standards and best practices.
- Optimization: Continuously evaluate and optimize AWS environments for cost, performance, and security.
- Security: Ensure the cloud architecture adheres to industry standards and security policies, using tools like AWS Identity and Access Management (IAM), AWS Key Management Service (KMS), and encryption protocols.
- Documentation & Reporting: Create clear technical documentation to define architectural decisions, solution designs, and cloud configurations.
- Stakeholder Collaboration: Work with cross-functional teams including developers, DevOps, QA, and business teams to align technical solutions with business goals.
- Continuous Learning: Stay updated with the latest AWS services, tools, and industry trends to ensure the implementation of cutting-edge solutions.
- Strong understanding of AWS cloud services and architecture.
- Hands-on experience with Infrastructure as Code (IaC) tools like AWS CloudFormation, Terraform, or AWS CDK.
- Knowledge of networking, security, and database services within AWS (e.g., VPC, IAM, RDS, and S3).
- Familiarity with containerization and orchestration using AWS services like ECS, EKS, or Fargate.
- Proficiency in scripting languages (e.g., Python, Shell, or Node.js).
- Familiarity with CI/CD tools and practices in AWS environments (e.g., CodePipeline, Jenkins, etc.).
Soft Skills:
Communication Skills:
- Clear and Concise Communication: Ability to articulate complex technical concepts in simple terms for both technical and non-technical stakeholders.
- Active Listening: Ability to listen to business and technical requirements from stakeholders to ensure the proposed solution meets their needs.
- Documentation Skills: Ability to document technical designs, solutions, and architectural decisions in a clear and well-organized manner.
Leadership and Team Collaboration:
- Mentoring and Coaching: Ability to mentor junior engineers, providing guidance and fostering professional growth.
- Cross-functional Teamwork: Collaborating effectively with various teams such as developers, DevOps, QA, business analysts, and security specialists to deliver integrated cloud solutions.
- Conflict Resolution: Addressing and resolving conflicts within teams and among stakeholders to ensure smooth project execution.
Problem-Solving and Critical Thinking:
- Analytical Thinking: Ability to break down complex problems and develop logical, scalable, and cost-effective solutions.
- Creative Thinking: Think outside the box to design innovative solutions that maximize the value of AWS technologies.
- Troubleshooting Skills: Quickly identifying root causes of issues and finding solutions to mitigate them.
Adaptability and Flexibility:
- Handling Change: Ability to adapt to evolving requirements, technologies, and business needs; cloud technologies and customer requirements change quickly.
- Resilience: Ability to deal with challenges and setbacks while maintaining a positive attitude and focus on delivering results.
Stakeholder Management:
- Client-facing Skills: Ability to manage client relationships, understand their business needs, and translate those needs into cloud solutions.
- Negotiation Skills: Negotiating technical aspects of projects with clients or business units to balance scope, resources, and timelines.
- Expectation Management: Ability to set and manage expectations regarding timelines, deliverables, and technical feasibility.
Decision-Making:
- Sound Judgment: Making well-informed and balanced decisions that consider both technical feasibility and business impact.
- Risk Management: Ability to assess risks in terms of cost, security, and performance and make decisions that minimize potential issues.
Preferred Skills:
- Familiarity with DevOps practices and tools (e.g., Jenkins, Docker, Kubernetes).
- Experience with serverless architectures using AWS Lambda, API Gateway, and DynamoDB.
- Exposure to multi-cloud architectures (AWS, Azure, Google Cloud).
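Several of the responsibilities above center on serverless building blocks such as Lambda behind API Gateway. A Lambda handler is just a function taking an event and a context, which makes it easy to unit-test locally before deployment. A hedged sketch with a hypothetical event shape and a made-up tax calculation as the business logic:

```python
import json

def handler(event, context):
    """Minimal AWS-Lambda-style handler: validate input, transform, respond.

    The event shape mirrors an API Gateway proxy event, but the field names
    and the 18% tax rule here are illustrative, not a real service contract.
    """
    try:
        body = json.loads(event.get("body", "{}"))
        amount = float(body["amount"])
    except (KeyError, ValueError) as exc:
        # Malformed or missing input maps to a 400 response.
        return {"statusCode": 400, "body": json.dumps({"error": str(exc)})}
    return {
        "statusCode": 200,
        "body": json.dumps({"amount_with_tax": round(amount * 1.18, 2)}),
    }

print(handler({"body": json.dumps({"amount": 100})}, None))
```

Keeping handlers as thin, pure functions like this is what lets teams test serverless code in plain CI without standing up AWS resources.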
Why Join Us:
- Competitive salary and benefits.
- Opportunity to work on cutting-edge cloud technologies.
- A dynamic work environment where innovation is encouraged.
- Strong focus on professional development and career growth.
Posted 1 month ago
3.0 - 8.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Azure Data Factory:
- Develop Azure Data Factory objects - ADF pipelines, configuration, parameters, variables, Integration Runtime.
- Hands-on knowledge of ADF activities (such as Copy, Stored Procedure, Lookup) and Data Flows.
- ADF data ingestion and integration with other services.
Azure Databricks:
- Experience in Big Data components such as Kafka, Spark SQL, DataFrames, and Hive DB implemented using Azure Databricks would be preferred.
- Azure Databricks integration with other services.
- Read and write data in Azure Databricks.
- Best practices in Azure Databricks.
Synapse Analytics:
- Import data into Azure Synapse Analytics with and without using PolyBase.
- Implement a data warehouse with Azure Synapse Analytics.
- Query data in Azure Synapse Analytics.
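Since the role revolves around ADF pipelines, parameters, and the Copy activity, it helps to remember that every ADF pipeline is ultimately a JSON document that can be generated and version-controlled like any other artifact. A sketch of that shape built in Python; the dataset names, parameter name, and source/sink types are hypothetical, and the property set is abbreviated relative to what ADF actually stores:

```python
import json

def copy_pipeline(name, source_ds, sink_ds):
    """Build an ADF-style pipeline definition with one parameterized
    Copy activity (illustrative shape, not a complete ARM resource)."""
    return {
        "name": name,
        "properties": {
            "parameters": {"ingestDate": {"type": "String"}},
            "activities": [{
                "name": "CopyRawToStaging",
                "type": "Copy",
                "inputs": [{"referenceName": source_ds, "type": "DatasetReference"}],
                "outputs": [{"referenceName": sink_ds, "type": "DatasetReference"}],
                "typeProperties": {
                    "source": {"type": "DelimitedTextSource"},
                    "sink": {"type": "AzureSqlSink"},
                },
            }],
        },
    }

pipeline = copy_pipeline("pl_daily_ingest", "ds_blob_csv", "ds_sql_staging")
print(json.dumps(pipeline, indent=2))
```

Treating pipeline definitions as generated JSON like this is one way teams keep ADF configuration reviewable and deployable through CI/CD rather than hand-edited in the portal.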
Posted 1 month ago
8.0 - 12.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Job Position: Python Lead
Total Exp Required: 6+ years
Relevant Exp Required: around 5 years
Mandatory skills required: Strong Python coding and development
Good-to-have skills: Cloud, SQL, data analysis
Location: Pune - Kharadi - WFO - 3 days/week
About The Role:
We are seeking a highly skilled and experienced Python Lead to join our team. The ideal candidate will have strong expertise in Python coding and development, along with good-to-have skills in cloud technologies, SQL, and data analysis.
Key Responsibilities:
- Lead the development of high-quality, scalable, and robust Python applications.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Ensure the performance, quality, and responsiveness of applications.
- Develop RESTful applications using frameworks like Flask, Django, or FastAPI.
- Utilize Databricks, PySpark SQL, and strong data analysis skills to drive data solutions.
- Implement and manage modern data solutions using Azure Data Factory, Data Lake, and Databricks.
Mandatory Skills:
- Proven experience with cloud platforms (e.g., AWS).
- Strong proficiency in Python, PySpark, and R, and familiarity with additional programming languages such as C++, Rust, or Java.
- Expertise in designing ETL architectures for batch and streaming processes, database technologies (OLTP/OLAP), and SQL.
- Experience with Apache Spark and multi-cloud platforms (AWS, GCP, Azure).
- Knowledge of data governance and GxP data contexts; familiarity with the Pharma value chain is a plus.
Good to Have Skills:
- Experience with modern data solutions via Azure.
- Knowledge of principles summarized in the Microsoft Cloud Adoption Framework.
- Additional expertise in SQL and data analysis.
Educational Qualifications: Bachelor's/Master's degree or equivalent with a focus on software engineering.
If you are a passionate Python developer with a knack for cloud technologies and data analysis, we would love to hear from you.
Join us in driving innovation and building cutting-edge solutions!
Posted 1 month ago
7.0 - 10.0 years
1 - 5 Lacs
Pune
Work from Office
Responsibilities:
- Design, develop, and deploy data pipelines using Databricks, including data ingestion, transformation, and loading (ETL) processes.
- Develop and maintain high-quality, scalable, and maintainable Databricks notebooks using Python.
- Work with Delta Lake and other advanced features.
- Leverage Unity Catalog for data governance, access control, and data discovery.
- Develop and optimize data pipelines for performance and cost-effectiveness.
- Integrate with various data sources, including but not limited to databases, cloud storage (Azure Blob Storage, ADLS, Synapse), and APIs.
- Experience working with Parquet files for data storage and processing.
- Experience with data integration from Azure Data Factory, Azure Data Lake, and other relevant Azure services.
- Perform data quality checks and validation to ensure data accuracy and integrity.
- Troubleshoot and resolve data pipeline issues effectively.
- Collaborate with data analysts, business analysts, and business stakeholders to understand their data needs and translate them into technical solutions.
- Participate in code reviews and contribute to best practices within the team.
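One bullet above is performing data quality checks and validation after a load. The simplest such checks, row-count reconciliation and null-key detection, are plain SQL; sketched here against an in-memory SQLite database standing in for the real staging and target tables (all table and column names are hypothetical; in Databricks the same queries would run via Spark SQL against Delta tables):

```python
import sqlite3

def rowcount_reconciliation(conn, source_table, target_table):
    """Basic post-load data-quality check: compare row counts between
    source and target, and count null business keys in the target."""
    cur = conn.cursor()
    src = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    null_keys = cur.execute(
        f"SELECT COUNT(*) FROM {target_table} WHERE id IS NULL"
    ).fetchone()[0]
    return {"source_rows": src, "target_rows": tgt,
            "null_keys": null_keys, "passed": src == tgt and null_keys == 0}

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging (id INTEGER, amount REAL);
    CREATE TABLE fact    (id INTEGER, amount REAL);
    INSERT INTO staging VALUES (1, 10.0), (2, 20.0);
    INSERT INTO fact    VALUES (1, 10.0), (2, 20.0);
""")
print(rowcount_reconciliation(conn, "staging", "fact"))
```

A failing report (`passed: False`) would typically halt the pipeline or raise an alert before downstream consumers read the bad load.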
Posted 1 month ago
4.0 - 9.0 years
6 - 10 Lacs
Hyderabad
Work from Office
- Building and operationalizing large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with custom solutions - Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, Cosmos DB, Event Hub/IoT Hub.
- Experience in migrating on-premise data warehouses to data platforms on the Azure cloud.
- Designing and implementing data engineering, ingestion, and transformation functions using Azure Synapse or Azure SQL Data Warehouse; Spark on Azure is available in HDInsight and Databricks.
- Good customer communication skills.
- Good analytical skills.
Posted 1 month ago
9.0 - 13.0 years
8 - 13 Lacs
Hyderabad
Work from Office
Experience: 8+ years
Location: Knowledge City, Hyderabad
Work Model: Hybrid, regular work hours
No. of rounds: 1 internal technical round and 2 client rounds
About You:
The GCP CloudOps Engineer is accountable for continuous, repeatable, secure, and automated deployment, integration, and test solutions utilizing Infrastructure as Code (IaC) and DevSecOps techniques.
- 8+ years of hands-on experience in infrastructure design, implementation, and delivery.
- 3+ years of hands-on experience with monitoring tools (Datadog, New Relic, or Splunk).
- 4+ years of hands-on experience with container orchestration services, including Docker or Kubernetes (GKE).
- Experience working across time zones and with different cultures.
- 5+ years of hands-on experience in cloud technologies; GCP is preferred.
- Maintain an outstanding level of documentation, including principles, standards, practices, and project plans.
- Experience building a data warehouse using Databricks is a huge plus.
- Hands-on experience with IaC patterns and practices and related automation tools such as Terraform, Jenkins, Spinnaker, and CircleCI; built automation and tools using Python, Go, Java, or Ruby.
- Deep knowledge of CI/CD processes, tools, and platforms like GitHub workflows and Azure DevOps.
- Proactive collaborator who can work in cross-team initiatives with excellent written and verbal communication skills.
- Experience automating long-term solutions to problems rather than applying a quick fix.
- Extensive knowledge of improving platform observability and implementing optimizations to monitoring and alerting tools.
- Experience measuring and modeling cost and performance metrics of cloud services and establishing a vision backed by data.
- Develop tools and a CI/CD framework to make it easier for teams to build, configure, and deploy applications.
- Contribute to Cloud strategy discussions and decisions on overall Cloud design and the best approach for implementing Cloud solutions.
- Follow and develop standards and procedures for all aspects of a digital platform in the Cloud.
- Identify system enhancements and automation opportunities for installing and maintaining digital platforms.
- Adhere to best practices on incident, problem, and change management.
- Implement automated procedures to handle issues and alerts proactively.
- Experience with debugging applications and a deep understanding of deployment architectures.
Pluses:
- Databricks.
- Experience with multi-cloud environments (GCP, AWS, Azure); GCP is the preferred cloud provider.
- Experience with GitHub and GitHub Actions.
Posted 1 month ago
6.0 - 10.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Job Role: Big Data Engineer
Work Location: Bangalore (CV Raman Nagar)
Experience: 7+ years
Notice Period: Immediate - 30 days
Mandatory Skills: Big Data, Python, SQL, Spark/PySpark, AWS Cloud
JD and required skills & responsibilities:
- Actively participate in all phases of the software development lifecycle, including requirements gathering, functional and technical design, development, testing, roll-out, and support.
- Solve complex business problems by utilizing a disciplined development methodology.
- Produce scalable, flexible, efficient, and supportable solutions using appropriate technologies.
- Analyse the source and target system data, and map the transformations that meet the requirements.
- Interact with the client and onsite coordinators during different phases of a project.
- Design and implement product features in collaboration with business and technology stakeholders.
- Anticipate, identify, and solve issues concerning data management to improve data quality.
- Clean, prepare, and optimize data at scale for ingestion and consumption.
- Support the implementation of new data management projects and re-structuring of the current data architecture.
- Implement automated workflows and routines using workflow scheduling tools.
- Understand and use continuous integration, test-driven development, and production deployment frameworks.
- Participate in design, code, and test plan reviews and dataset implementations performed by other data engineers in support of maintaining data engineering standards.
- Analyze and profile data for the purpose of designing scalable solutions.
- Troubleshoot straightforward data issues and perform root cause analysis to proactively resolve product issues.
Required Skills:
- 5+ years of relevant experience developing data and analytics solutions.
- Experience building data lake solutions leveraging one or more of the following: AWS, EMR, S3, Hive, and PySpark.
- Experience with relational SQL.
- Experience with scripting languages such as Python.
- Experience with source control tools such as GitHub and the related dev process.
- Experience with workflow scheduling tools such as Airflow.
- In-depth knowledge of AWS Cloud (S3, EMR, Databricks).
- A passion for data solutions.
- A strong problem-solving and analytical mindset.
- Working experience in the design, development, and testing of data pipelines.
- Experience working with Agile teams.
- Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders.
- Able to quickly pick up new programming languages, technologies, and frameworks.
- Bachelor's degree in Computer Science.
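The JD asks for experience with workflow scheduling tools such as Airflow. Stripped to its essence, a scheduler topologically sorts a DAG of tasks and executes them in dependency order, which the standard library can demonstrate directly. The task names below are illustrative; a real Airflow DAG would add retries, scheduling intervals, and distributed execution on top of this ordering idea:

```python
from graphlib import TopologicalSorter

def run_workflow(tasks, deps):
    """Run tasks in dependency order, like a toy workflow scheduler.

    `tasks` maps task name -> zero-arg callable; `deps` maps task name ->
    set of upstream task names that must finish first.
    """
    order = list(TopologicalSorter(deps).static_order())
    results = {}
    for name in order:
        results[name] = tasks[name]()
    return order, results

log = []
tasks = {
    "extract":   lambda: log.append("extract") or "raw",
    "transform": lambda: log.append("transform") or "clean",
    "load":      lambda: log.append("load") or "done",
}
deps = {"transform": {"extract"}, "load": {"transform"}}
order, results = run_workflow(tasks, deps)
print(order)
```

`TopologicalSorter` raises `CycleError` on circular dependencies, which is exactly the invariant that makes these pipelines DAGs rather than arbitrary graphs.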
Posted 1 month ago