2.0 - 6.0 years
0 Lacs
karnataka
On-site
Job Description: As a Software Engineer at Elevance Health, you will design and implement data storage solutions on Snowflake's cloud infrastructure. The role centers on PL/SQL development — procedures, exception handling, advanced SQL, and query tuning — along with proficiency in Python. You will also optimize SQL queries for performance and scalability, and you should have a good working knowledge of Snowflake features such as Snowpipe, Time Travel, and Streams & Tasks.

You will own development and unit testing before handover to QA, ensuring code quality and performance. Agile experience and excellent communication skills are highly valued, as is a solid understanding of databases, specifically Oracle and Snowflake. Familiarity with job scheduling tools like Control-M for scheduling and debugging jobs is a plus, and knowledge of the healthcare domain, especially the Provider area, is an advantage.

Experience with SDLC methodologies such as Waterfall and Agile is preferred, along with the ability to work independently and collaborate effectively with global teams. Strong analytical and problem-solving skills are essential for addressing production failures and data quality issues, and your commitment, accountability, and communication skills will be crucial as you integrate Snowflake with various sources and keep business stakeholders informed.

Elevance Health offers a world of limitless opportunities, emphasizing continuous learning and development, holistic well-being, and a supportive work environment. If you are a dedicated software engineer with a passion for innovative technologies and a desire to make a difference in healthcare operations, we invite you to join our team at Elevance Health and contribute to our mission of Improving Lives and Communities while Simplifying Healthcare.
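For illustration, here is a minimal sketch of the Streams & Tasks pattern this posting names, written in Python with the snowflake-connector-python library. All credentials, object names, and the schedule are hypothetical stand-ins, not details from the role.

```python
# Minimal sketch of the Snowflake Streams & Tasks pattern referenced above.
# Connection parameters, database objects, and the schedule are illustrative.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical account identifier
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="DEMO_DB",
    schema="PUBLIC",
)
cur = conn.cursor()

# A stream captures row-level changes (CDC) on a source table.
cur.execute("CREATE STREAM IF NOT EXISTS claims_stream ON TABLE raw_claims")

# A task periodically moves those captured changes into a curated table,
# but only when the stream actually has new data.
cur.execute("""
    CREATE TASK IF NOT EXISTS merge_claims_task
      WAREHOUSE = ETL_WH
      SCHEDULE = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('CLAIMS_STREAM')
    AS
      INSERT INTO curated_claims
      SELECT * FROM claims_stream WHERE METADATA$ACTION = 'INSERT'
""")
cur.execute("ALTER TASK merge_claims_task RESUME")  # tasks are created suspended
conn.close()
```

The `WHEN SYSTEM$STREAM_HAS_DATA(...)` guard keeps the warehouse from spinning up on empty runs, which is a common cost-optimization idiom with Streams & Tasks.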
Posted 2 days ago
4.0 - 10.0 years
0 Lacs
karnataka
On-site
You are a developer of digital futures at Tietoevry, a leading technology company with a strong Nordic heritage and global capabilities. With core values of openness, trust, and diversity, you collaborate with customers to create digital futures where businesses, societies, and humanity thrive. The company's 24,000 experts specialize in cloud, data, and software, serving enterprise and public-sector customers in around 90 countries. Tietoevry's annual turnover is approximately EUR 3 billion, and its shares are listed on the NASDAQ exchanges in Helsinki and Stockholm as well as on Oslo Børs.

In the USA, EVRY USA delivers IT services through global delivery centers and offices in India (EVRY India). The company offers a comprehensive IT services portfolio, driving digital transformation across sectors like Banking & Financial Services, Insurance, Healthcare, Retail & Logistics, and Energy, Utilities & Manufacturing. EVRY India's process and project maturity are high, with offshore development centers in India appraised at CMMI DEV Maturity Level 5 & CMMI SVC Maturity Level 5 and certified under ISO 9001:2015 & ISO/IEC 27001:2013.

As a Senior Data Modeler, you will lead the design and development of enterprise-grade data models for a modern cloud data platform built on Snowflake and Azure. With a strong foundation in data modeling best practices and hands-on experience with the Medallion Architecture, you will ensure data structures are scalable, reusable, and aligned with business and regulatory requirements. You will build data models that meet processing, analytics, and reporting needs, focusing on Snowflake data warehousing and the Medallion Architecture's Bronze, Silver, and Gold layers. Collaborating with various stakeholders, you will translate business needs into scalable data models, drive data model governance, and ensure compliance with data governance, quality, and security requirements.

**Pre-requisites:**
- 10 years of experience in data modeling, data architecture, or data engineering roles.
- 4 years of experience modeling data in Snowflake or other cloud data warehouses.
- Strong understanding of, and hands-on experience with, the Medallion Architecture and modern data platform design.
- Experience with data modeling tools (Erwin, etc.).
- Proficiency in data modeling techniques: 3NF, dimensional modeling, data vault, and star/snowflake schemas.
- Expert-level SQL and experience working with semi-structured data (JSON, XML).
- Familiarity with Azure data services (ADF, ADLS, Synapse, Purview).

**Key Responsibilities:**
- Design, develop, and maintain data models for Snowflake data warehousing.
- Lead the design and implementation of logical, physical, and canonical data models.
- Architect data models for the Bronze, Silver, and Gold layers following the Medallion Architecture (a schematic star-schema sketch follows this listing).
- Collaborate with stakeholders to translate business needs into scalable data models.
- Drive data model governance and compliance with data requirements.
- Conduct data profiling, gap analysis, and data integration efforts.
- Support Time Travel-style (point-in-time) reporting and build models for operational and analytical reports.

Recruiter Information:
- Recruiter Name: Harish Gotur
- Recruiter Email Id: harish.gotur@tietoevry.com
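As an illustrative aside on the dimensional modeling and Gold-layer work described above, the sketch below creates a minimal star schema (one dimension, one fact) in Snowflake from Python. Schema, table, and column names are invented for the example, not taken from the role.

```python
# A minimal sketch of a Gold-layer star schema (Medallion Architecture),
# expressed as Snowflake DDL and executed via snowflake-connector-python.
# All object names are illustrative; silver.orders is assumed to exist.
import snowflake.connector

STATEMENTS = [
    "CREATE SCHEMA IF NOT EXISTS gold",
    # Gold layer: conformed dimension fed from the cleansed Silver layer
    """CREATE TABLE IF NOT EXISTS gold.dim_customer (
           customer_sk   NUMBER AUTOINCREMENT PRIMARY KEY,  -- surrogate key
           customer_id   VARCHAR NOT NULL,                  -- business key
           customer_name VARCHAR,
           segment       VARCHAR
       )""",
    # Gold layer: fact table, one row per order line
    """CREATE TABLE IF NOT EXISTS gold.fact_orders (
           order_date_key DATE,
           customer_sk    NUMBER REFERENCES gold.dim_customer (customer_sk),
           quantity       NUMBER,
           net_amount     NUMBER(18, 2)
       )""",
    # Populate the Gold fact from Silver, resolving the surrogate key
    """INSERT INTO gold.fact_orders
       SELECT o.order_date, d.customer_sk, o.quantity, o.net_amount
       FROM silver.orders o
       JOIN gold.dim_customer d ON d.customer_id = o.customer_id""",
]

conn = snowflake.connector.connect(account="my_account", user="etl_user",
                                   password="***", database="DEMO_DB")
with conn.cursor() as cur:
    for stmt in STATEMENTS:
        cur.execute(stmt)
conn.close()
```

Note that Snowflake records but does not enforce PRIMARY KEY/REFERENCES constraints; the join on the business key is what actually guarantees referential consistency here.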
Posted 2 days ago
7.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
As a Lead Data Engineer with 7-12 years of experience, you will be an integral part of our team, contributing significantly to the design, development, and maintenance of our data infrastructure. Your primary responsibilities will revolve around creating and managing robust data architectures, ETL processes, and data warehouses, and utilizing big data and cloud technologies to support our business intelligence and analytics needs.

You will lead the design and implementation of data architectures that facilitate data warehousing, integration, and analytics platforms. Developing and optimizing ETL pipelines will be a key aspect of your role: ensuring efficient processing of large datasets and implementing data transformation and cleansing processes to maintain data quality. Your expertise will be crucial in building and maintaining scalable data warehouse solutions using technologies such as Snowflake, Databricks, or Redshift. Additionally, you will leverage AWS Glue and PySpark for large-scale data processing, manage data pipelines with Apache Airflow, and utilize cloud platforms like AWS, Azure, and GCP for data storage, processing, and analytics.

Establishing data governance and security best practices, ensuring data integrity, accuracy, and availability, and implementing monitoring and alerting systems are vital components of your responsibilities. Collaborating closely with stakeholders, mentoring junior engineers, and leading data-related projects will also be part of your role.

Your technical skills should include proficiency in ETL tools like Informatica PowerCenter, Python, PySpark, SQL, RDBMS platforms, and data warehousing concepts. Soft skills such as excellent communication, leadership, problem-solving, and the ability to manage multiple projects effectively will be essential for success in this role. Preferred qualifications include experience with machine learning workflows, certification in relevant data engineering technologies, and familiarity with Agile methodologies and DevOps practices.

Location: Hyderabad
Employment Type: Full-time
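To make the ETL transformation-and-cleansing step above concrete, here is a minimal PySpark sketch: read raw data, deduplicate and standardize it, and write curated output. The S3 paths, columns, and cleansing rules are hypothetical.

```python
# Minimal PySpark sketch of an ETL cleansing/transformation step;
# paths, schema, and rules are illustrative stand-ins.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw landing data (hypothetical bucket/path)
raw = spark.read.parquet("s3://example-bucket/landing/orders/")

# Transform: cleanse and standardize
clean = (
    raw.dropDuplicates(["order_id"])                 # remove replayed records
       .filter(F.col("order_amount") > 0)            # drop invalid rows
       .withColumn("order_date", F.to_date("order_ts"))
       .fillna({"channel": "unknown"})               # default missing values
)

# Load: write partitioned output for the warehouse to consume
clean.write.mode("overwrite").partitionBy("order_date") \
     .parquet("s3://example-bucket/curated/orders/")
```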
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
navi mumbai, maharashtra
On-site
The ideal candidate for the Data Scientist position at TransOrg Analytics should have a Bachelor's degree in Computer Science, Engineering, Statistics, Mathematics, or a related quantitative field. You should possess 3-5 years of relevant experience in data analysis or a related role, along with expertise in data pre-processing and manipulation using Python or SQL. Experience in statistical analysis and modeling techniques such as regression, classification, and clustering is essential, and candidates with experience in machine learning, deep learning frameworks, and libraries will be preferred. Proficiency in data processing tools and frameworks, as well as data visualization tools like Tableau or Power BI, is required.

You should have a proven track record of managing data project delivery: meeting deadlines, managing stakeholder expectations, and producing clear deliverables. Strong problem-solving skills and attention to detail are crucial for this role, along with the ability to think critically and provide data-backed insights. Excellent communication skills, both verbal and written, are a must. Familiarity with cloud platforms like Azure, AWS, or GCP, and the ability to use them for developing, training, and testing deep learning models, will be an added advantage, as will knowledge of cloud-based data warehousing platforms, particularly Snowflake.

If you are passionate about leveraging data science to streamline, optimize, and accelerate businesses, and meet the above requirements, we invite you to join our team at TransOrg Analytics. Visit www.transorg.com to learn more about us and explore how you can contribute to our mission of providing advanced analytics solutions to industry leaders and Fortune 500 companies across India, the US, APAC, and the Middle East.
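As a small illustration of the modeling techniques this listing names (here, classification with pre-processing), a scikit-learn sketch on a stand-in dataset:

```python
# A short scikit-learn sketch of a pre-processing + classification workflow;
# the bundled dataset is a stand-in for real project data.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Scale features, then fit a logistic-regression classifier
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

# Evaluate on the held-out split
print(classification_report(y_test, model.predict(X_test)))
```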
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
We are developers of digital futures! Tietoevry creates purposeful technology that reinvents the world for good. We are a leading technology company with a strong Nordic heritage and global capabilities. Based on our core values of openness, trust, and diversity, we work with our customers to develop digital futures where businesses, societies, and humanity thrive. Our 24,000 experts globally specialize in cloud, data, and software, serving thousands of enterprise and public-sector customers in approximately 90 countries. Tietoevry's annual turnover is approximately EUR 3 billion, and the company's shares are listed on the NASDAQ exchanges in Helsinki and Stockholm, as well as on Oslo Børs.

EVRY USA delivers IT services to a wide range of customers in the USA through its global delivery centers and India offices (EVRY India) in Bangalore & Chandigarh, India. We offer a comprehensive IT services portfolio and drive digital transformation across various sectors including Banking & Financial Services, Insurance, Healthcare, Retail & Logistics, and Energy, Utilities & Manufacturing. EVRY India's process and project maturity is very high, with the two offshore development centers in India appraised at CMMI DEV Maturity Level 5 & CMMI SVC Maturity Level 5 and certified under ISO 9001:2015 & ISO/IEC 27001:2013.

We are seeking a highly experienced Snowflake Architect with deep expertise in building scalable data platforms on Azure, applying Medallion Architecture principles. The ideal candidate has strong experience in the Banking domain and will play a key role in architecting secure, performant, and compliant data solutions to support business intelligence, risk, compliance, and analytics initiatives.

**Pre-requisites:**
- 5 years of hands-on experience in Snowflake, including schema design, security setup, and performance tuning.
- Implementation experience using Snowpark.
- Must have a data architecture background.
- Has deployed a fully operational data solution into production on Snowflake & Azure.
- Snowflake certification preferred.
- Familiarity with data modeling practices such as dimensional modeling & data vault.
- Understanding of the dbt tool.

**Key Responsibilities:**
- Design and implement scalable and performant data platforms using Snowflake on Azure, tailored for banking industry use cases.
- Architect ingestion, transformation, and consumption layers using the Medallion Architecture for a performant & scalable data platform.
- Work with data engineers to build modular and reusable Bronze, Silver, and Gold layer models that support diverse workloads.
- Provide architectural oversight and best practices to ensure scalability, performance, and maintainability.
- Collaborate with stakeholders from risk, compliance, and analytics teams to translate requirements into data-driven solutions.
- Build architecture to support Time Travel-style (point-in-time) reporting (see the sketch after this listing).
- Support CI/CD automation and environment management using tools like Azure DevOps and Git.
- Build architecture to support operational & analytical reports.

Recruiter Information:
- Recruiter Name: Harish Gotur
- Recruiter Email Id: harish.gotur@tietoevry.com
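For the Time Travel-style reporting responsibility above, a hedged Snowpark sketch follows; the connection settings, table, and report query are illustrative assumptions, not details from the role.

```python
# Hedged sketch of Time Travel-based (point-in-time) reporting with Snowpark,
# which the posting calls out; all names and credentials are illustrative.
from snowflake.snowpark import Session

session = Session.builder.configs({
    "account": "my_account",   # hypothetical credentials
    "user": "arch_user",
    "password": "***",
    "warehouse": "BI_WH",
    "database": "BANK_DB",
    "schema": "GOLD",
}).create()

# Query yesterday's state of a table using Time Travel (AT ... OFFSET),
# e.g. to reproduce a regulatory report as of a prior point in time.
report = session.sql("""
    SELECT account_id, SUM(balance) AS total_balance
    FROM positions AT(OFFSET => -86400)   -- state as of 24 hours ago
    GROUP BY account_id
""")
report.show()
session.close()
```

Time Travel only reaches back as far as the table's data retention window, so architectures that need longer point-in-time history typically pair it with periodic snapshots.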
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
noida, uttar pradesh
On-site
We are looking for a highly skilled and proactive Snowflake & StreamSets Platform Administrator to support and enhance enterprise data engineering and analytics platforms. In this role, you will be responsible for managing large-scale Snowflake data warehouses and StreamSets data pipelines, bringing strong troubleshooting, automation, and monitoring capabilities. Your primary focus will be on ensuring platform reliability, performance, security, and compliance, and collaboration with cross-functional teams such as data engineers, DevOps, and support teams will be essential.

The ideal candidate has at least 5 years of experience in this field and is comfortable working in a full-time role with 24x7 support (rotational shifts). The position can be based in Pune, Hyderabad, Noida, or Bengaluru. If you are a detail-oriented individual with a passion for data management and platform administration, and possess the necessary technical skills, this opportunity could be a perfect fit for you. Join our team at the Birlasoft office in Noida, India, and contribute to the success of our data engineering and analytics initiatives.
Posted 2 days ago
15.0 - 19.0 years
0 Lacs
hyderabad, telangana
On-site
As a Technical Lead / Data Architect, you will play a crucial role in our organization by leveraging your expertise in modern data architectures, cloud platforms, and analytics technologies. In this leadership position, you will be responsible for designing robust data solutions, guiding engineering teams, and ensuring successful project execution in collaboration with the project manager.

Your key responsibilities will include architecting and designing end-to-end data solutions across multi-cloud environments such as AWS, Azure, and GCP. You will lead and mentor a team of data engineers, BI developers, and analysts to deliver on complex project deliverables, and define and enforce best practices in data engineering, data warehousing, and business intelligence. You will design scalable data pipelines using tools like Snowflake, dbt, Apache Spark, and Airflow, and act as a technical liaison with clients, providing strategic recommendations and maintaining strong relationships.

To be successful in this role, you should have at least 15 years of experience in IT with a focus on data architecture, engineering, and cloud-based analytics. You must have expertise in multi-cloud environments and cloud-native technologies, along with deep knowledge of Snowflake, data warehousing, ETL/ELT pipelines, and BI platforms. Strong leadership and mentoring skills are essential, as are excellent communication and interpersonal abilities to engage with both technical and non-technical stakeholders.

In addition to the required qualifications, certifications in major cloud platforms and experience in enterprise data governance, security, and compliance are preferred. Familiarity with AI/ML pipeline integration would be a plus. We offer a collaborative work environment, opportunities to work with cutting-edge technologies and global clients, competitive salary and benefits, and continuous learning and professional development opportunities. Join us in driving innovation and excellence in data architecture and analytics.
Posted 2 days ago
10.0 - 14.0 years
0 Lacs
haryana
On-site
As a Digital Product Engineering company, Nagarro is seeking a talented individual to join our dynamic and non-hierarchical work culture as a Data Engineer. With over 17,500 experts across 39 countries, we are scaling in a big way, and we are looking for someone with 10+ years of total experience to contribute to our team.

**Requirements:**
- Strong working experience in Data Engineering and Big Data platforms.
- Hands-on experience with Python and PySpark.
- Expertise with AWS Glue, including Crawlers and the Data Catalog (an illustrative sketch follows this listing).
- Experience with Snowflake and a strong understanding of AWS services such as S3, Lambda, Athena, SNS, and Secrets Manager.
- Familiarity with Infrastructure-as-Code (IaC) tools like CloudFormation and Terraform.
- Strong experience with CI/CD pipelines, preferably using GitHub Actions.
- Working knowledge of Agile methodologies, JIRA, and GitHub version control.
- Exposure to data quality frameworks, observability, and data governance tools and practices.
- Excellent communication skills and the ability to collaborate effectively with cross-functional teams.

**Responsibilities:**
- Writing and reviewing high-quality code to meet technical requirements.
- Understanding clients' business use cases and converting them into technical designs.
- Identifying and evaluating different solutions to meet clients' requirements.
- Defining guidelines and benchmarks for Non-Functional Requirements (NFRs) during project implementation.
- Developing design documents explaining the architecture, framework, and high-level design of applications.
- Reviewing architecture and design aspects such as extensibility, scalability, security, design patterns, user experience, and NFRs.
- Designing overall solutions for defined functional and non-functional requirements and defining technologies, patterns, and frameworks.
- Relating technology integration scenarios and applying the learnings in projects.
- Resolving issues raised during code reviews through systematic root-cause analysis.
- Conducting Proof of Concepts (POCs) to ensure suggested designs/technologies meet the requirements.

**Qualifications:**
- Bachelor's or master's degree in Computer Science, Information Technology, or a related field.

If you are passionate about Data Engineering, experienced in working with Big Data platforms, proficient in Python and PySpark, and have a strong understanding of AWS services and Infrastructure-as-Code tools, we invite you to join Nagarro and be part of our innovative team.
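To illustrate the AWS Glue Crawler and Data Catalog requirement above, here is a hedged boto3 sketch; the IAM role ARN, bucket path, and resource names are placeholders.

```python
# Illustrative boto3 sketch of the AWS Glue pieces listed above: create a
# crawler that populates the Data Catalog from S3, then list what it found.
# All names and the role ARN are hypothetical.
import boto3

glue = boto3.client("glue", region_name="ap-south-1")

glue.create_crawler(
    Name="orders_crawler",
    Role="arn:aws:iam::123456789012:role/GlueCrawlerRole",  # placeholder role
    DatabaseName="analytics_catalog",
    Targets={"S3Targets": [{"Path": "s3://example-bucket/curated/orders/"}]},
)
glue.start_crawler(Name="orders_crawler")

# Once the crawler finishes, the discovered tables are queryable via Athena.
tables = glue.get_tables(DatabaseName="analytics_catalog")
print([t["Name"] for t in tables["TableList"]])
```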
Posted 2 days ago
6.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
As a Lead Data Engineer specializing in Snowflake Migration at Anblicks, you will be a key player in our Data Modernization Center of Excellence (COE). You will be at the forefront of transforming traditional data platforms by utilizing Snowflake, cloud-native tools, and intelligent automation to help enterprises unlock the power of the cloud.

Your primary responsibility will be to lead the migration of legacy data warehouses such as Teradata, Netezza, Oracle, or SQL Server to Snowflake. You will re-engineer and modernize ETL pipelines using cloud-native tools and frameworks like DBT, Snowflake Tasks, Streams, and Snowpark. Additionally, you will design robust ELT pipelines on Snowflake that ensure high performance, scalability, and cost optimization, while integrating Snowflake with AWS, Azure, or GCP.

In this role, you will also focus on implementing secure and compliant architectures with RBAC, masking policies, Unity Catalog, and SSO. Automation of repeatable tasks, ensuring data quality and parity between source and target systems, and mentoring junior engineers will be essential aspects of your responsibilities. Collaboration with client stakeholders, architects, and delivery teams to define migration strategies, as well as presenting solutions and roadmaps to technical and business leaders, will also be part of your role.

To qualify for this position, you should have at least 6 years of experience in Data Engineering or Data Warehousing, with a minimum of 3 years of hands-on experience in Snowflake design and development. Strong expertise in migrating ETL pipelines from Talend and/or Informatica to cloud-native alternatives, and proficiency in SQL, data modeling, ELT design, and pipeline performance tuning, are prerequisites. Familiarity with tools like DBT Cloud, Airflow, Snowflake Tasks, or similar orchestrators, as well as a solid understanding of cloud data architecture, security frameworks, and data governance, is also required.

Preferred qualifications include Snowflake certifications (SnowPro Core and/or SnowPro Advanced Architect), experience with custom migration tools, metadata-driven pipelines, or LLM-based code conversion, familiarity with domain-specific architectures in Retail, Healthcare, or Manufacturing, and prior experience in a COE or modernization-focused consulting environment.

By joining Anblicks as a Lead Data Engineer, you will have the opportunity to lead enterprise-wide data modernization programs, tackle complex real-world challenges, and work alongside certified Snowflake architects, cloud engineers, and innovation teams. You will also have the chance to build reusable IP that scales across clients and industries, while experiencing accelerated career growth in the dynamic Data & AI landscape.
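As an illustrative sketch of the RBAC and masking-policy work mentioned above, expressed as Snowflake SQL issued from Python; the roles, table, and column are assumptions invented for the example.

```python
# Minimal sketch of Snowflake column masking + RBAC grants; every role,
# table, and column name below is a hypothetical stand-in.
import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="admin_user",
                                   password="***", role="SECURITYADMIN")
with conn.cursor() as cur:
    # Column-level masking: only a privileged role sees raw SSNs.
    cur.execute("""
        CREATE MASKING POLICY IF NOT EXISTS mask_ssn AS (val STRING)
        RETURNS STRING ->
          CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val
               ELSE 'XXX-XX-XXXX' END
    """)
    cur.execute("""
        ALTER TABLE demo_db.public.members
        MODIFY COLUMN ssn SET MASKING POLICY mask_ssn
    """)
    # RBAC: grant read access to an analyst role.
    cur.execute("GRANT USAGE ON DATABASE demo_db TO ROLE analyst")
    cur.execute("GRANT USAGE ON SCHEMA demo_db.public TO ROLE analyst")
    cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA demo_db.public TO ROLE analyst")
conn.close()
```

The policy travels with the column rather than the query, so every consumer — dashboards, ad-hoc SQL, ELT jobs — gets consistent masking without per-application logic.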
Posted 2 days ago
4.0 - 8.0 years
0 - 0 Lacs
gurugram
On-site
Position Overview

We are seeking a highly skilled and experienced Senior Tableau Developer to join our dynamic team in Gurugram. This is an exciting opportunity for a professional who is passionate about data visualization and analytics. The ideal candidate will have a strong background in Tableau development, along with expertise in SQL and Snowflake. As a Senior Tableau Developer, you will play a crucial role in transforming data into actionable insights that drive business decisions.

Key Responsibilities
- Design, develop, and maintain interactive dashboards and reports using Tableau Desktop.
- Collaborate with cross-functional teams to gather requirements and understand business needs.
- Utilize SQL to extract, manipulate, and analyze data from various sources.
- Implement best practices for data visualization and ensure high-quality deliverables.
- Optimize Tableau performance and troubleshoot any issues that arise.
- Work with Snowflake to manage and query large datasets efficiently.
- Provide training and support to junior developers and stakeholders on Tableau functionalities.
- Stay updated with the latest trends and advancements in data visualization and analytics.

Qualifications

The ideal candidate will possess the following qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4 to 8 years of relevant work experience in Tableau development.
- Strong proficiency in Tableau Desktop and Tableau Server.
- Hands-on experience with SQL and Snowflake.
- Excellent analytical and problem-solving skills.
- Ability to work independently and as part of a team in a fast-paced environment.
- Strong communication skills, with the ability to present complex data insights clearly.

This is a full-time position with a day schedule and requires on-site work. We are looking to fill 1 position with a competitive annual salary of 18,00,000. If you are a motivated individual with a passion for data and a desire to make an impact, we encourage you to apply and join our team!
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
You will be working as an OutSystems or Snowflake Developer at KPMG in India, a professional services firm affiliated with KPMG International Limited. Established in India in August 1993, KPMG leverages a global network of firms and possesses in-depth knowledge of local laws, regulations, and markets. With offices in multiple cities across India, KPMG offers services to national and international clients across various sectors.

As an OutSystems or Snowflake Developer, you will be responsible for developing and maintaining applications using OutSystems or Snowflake technologies. Your role will involve collaborating with the team to deliver high-quality solutions that meet client requirements. Additionally, you will contribute to the enhancement of technology-enabled services, leveraging your expertise across global and local industries.

To qualify for this position, you should possess a Bachelor's degree or equivalent in a relevant field. Your ability to work effectively in a team, adapt to changing technology landscapes, and deliver innovative solutions will be crucial to your success in this role at KPMG in India. Join us at KPMG and be part of a dynamic team that values equal employment opportunities and encourages professional growth and development.
Posted 2 days ago
8.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
Cowbell is signaling a new era in cyber insurance by harnessing technology and data to provide small and medium-sized enterprises (SMEs) with advanced warning of cyber risk exposures bundled with cyber insurance coverage adaptable to the threats of today and tomorrow. Championing adaptive insurance, Cowbell follows policyholders' cyber risk exposures as they evolve through continuous risk assessment and continuous underwriting. In its unique AI-based approach to risk selection and pricing, Cowbell's underwriting platform, powered by Cowbell Factors, compresses the insurance process from submission to issue to less than 5 minutes.

Founded in 2019 and based in the San Francisco Bay Area, Cowbell has rapidly grown, now operating across the U.S., Canada, U.K., and India. This growth was recently bolstered by a successful Series C fundraising round of $60 million from Zurich Insurance. This investment not only underscores the confidence in Cowbell's mission but also accelerates our capacity to revolutionize cyber insurance on a global scale. With the backing of over 25 prominent reinsurance partners, Cowbell is poised to redefine how SMEs navigate the evolving landscape of cyber threats.

In support of business objectives, we are actively looking for an ambitious person who is not afraid of hard work and embraces ambiguity as it comes, to join our Information Security Team as a Sr. Developer, Application Security. The InfoSec team drives security, privacy, and compliance improvements to reduce risk by building out key security programs. We enable our colleagues to keep the company secure and support our customers' security journey with tried and true best practices. We are a Java, Python, and React shop combined with world-class cloud infrastructure such as AWS & Snowflake. Balancing proper security while enabling execution speed for our colleagues is our ultimate goal. It's challenging and rewarding! If you are up for the challenge, come join us.

You will be instrumental in curing security defects in code, burning down any new and existing vulnerabilities; you can fix the code yourself, and continuous patching is your north star (a small remediation sketch follows this listing). You will be the champion for safeguards and standards that will keep our code secure and reduce the introduction of new vulnerabilities.

Responsibilities:
- Partner and collaborate with internal stakeholders in assisting with the overall security posture, with an emphasis on the Engineering and Operations/IT areas.
- Work across engineering, product, and business systems teams to enhance and evangelize security in applications (and infrastructure).
- Research emerging technologies and maintain awareness of current security risks in support of security enhancement and development efforts.
- Develop and maintain application scanning solutions to inform stakeholders of security weaknesses and vulnerabilities.
- Review outstanding vulnerabilities with product teams and assist in remediation efforts to reduce risk.

Qualifications:
- Bachelor's degree in computer science or another STEM discipline and 8 to 10+ years of professional experience in security software development.
- Majority of prior experience as a Security Engineer focused on remediation of security vulnerabilities and defects in Java and Python.
- Must have prior in-depth, demonstrable experience developing in Java and Python; basically, you are a developer first and a security engineer second. Applicants without this experience will not be considered.
- Experience developing in, and securing, JavaScript and React a plus.
- Experience securing integrations and code that utilizes Elasticsearch, Snowflake, Databricks, and RDS a big plus.
- Detail-oriented, with problem-solving, communication, and analytical skills.
- Expert understanding of CVE and CVSS scoring and how to utilize this data for validation, prioritization, and remediation.
- Excellent understanding and utilization of OWASP.
- Demonstrated ability to secure APIs; techniques and patterns will be assessed.
- Experience designing and implementing application security solutions for web and/or mobile applications.
- Experience developing and reporting vulnerability metrics, as well as articulating how to reproduce and resolve those security defects.
- Experienced in application penetration testing, with an understanding of remediation techniques for common misconfigurations and vulnerabilities.
- Demonstrable experience in understanding patching and library upgrade paths, including interdependencies.
- Familiarity with CI/CD tools; previous admin experience in CI/CD is not required but a big plus.
- Capability to deploy, provide maintenance for, and operationalize scanning solutions.
- Hands-on ability to conduct scans across application repositories and infrastructure.
- Must be willing to work extended hours and weekends as needed.
- Great at, and enjoys, documenting solutions: creating repeatable instructions for others, operational documentation, technical diagrams, and similar artifacts.

Preferred Qualifications:
- You can demonstrate and document threat modeling scenarios using well-known frameworks such as STRIDE.
- Proficient with penetration testing tools such as Burp Suite, Metasploit, or ZAP.
- You are already proficient with SAST & SCA tools; proficiency with DAST and/or OAST tool usage and techniques would be even better.
- As a mentor, you have the experience and desire to provide fellow engineering teams with technical guidance on the impact and priority of security issues and to drive remediation.
- Capability to develop operational processes from scratch, or improve current processes and procedures through well-thought-out hand-offs, integrations, and automation.
- Familiarity with multiple security domains such as application security, infrastructure security, network security, incident response, and regulatory compliance and certifications.
- Understanding of modern endpoint security technologies/concepts.
- Adept at working with distributed team members.

What Cowbell brings to the table:
- Employee equity plan for all and wealth enablement plan for select customer-facing roles.
- Comprehensive wellness program, meditation app subscriptions, lunch and learn, book club, happy hours, and much more.
- Professional development and the opportunity to learn the ins and outs of cyber insurance and cybersecurity, as well as continuing to build your professional skills in a team environment.

Equal Employment Opportunity: Cowbell is a leading innovator in cyber insurance, dedicated to empowering businesses to always deliver their intended outcomes as the cyber threat landscape evolves. Guided by our core values of TRUE (Transparency, Resiliency, Urgency, and Empowerment), we are on a mission to be the gold standard for businesses to understand, manage, and transfer cyber risk. At Cowbell, we foster a collaborative and dynamic work environment where every employee is empowered to contribute and grow. We pride ourselves on our commitment to transparency and resilience, ensuring that we not only meet but exceed industry standards.
We are proud to be an equal opportunity employer, promoting a diverse and inclusive workplace where all voices are heard and valued. Our employees enjoy competitive compensation, comprehensive benefits, and continuous opportunities for professional development.
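As a small, generic illustration of the remediation work this posting describes (not Cowbell's actual codebase), here is the classic OWASP injection fix in Python: replace string-built SQL with a parameterized query. The table and in-memory database are stand-ins.

```python
# Hedged example of an OWASP-style SQL injection remediation;
# the schema and connection are illustrative stand-ins.
import sqlite3

def find_user_unsafe(conn: sqlite3.Connection, name: str):
    # VULNERABLE: attacker-controlled `name` is concatenated into the SQL —
    # the classic injection flaw (e.g. name = "x' OR '1'='1").
    return conn.execute(
        f"SELECT id, name FROM users WHERE name = '{name}'"
    ).fetchall()

def find_user_safe(conn: sqlite3.Connection, name: str):
    # REMEDIATED: bind the value as a parameter; the driver handles escaping.
    return conn.execute(
        "SELECT id, name FROM users WHERE name = ?", (name,)
    ).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
    conn.execute("INSERT INTO users VALUES (1, 'alice')")
    print(find_user_safe(conn, "alice"))
```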
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a Senior Software Engineer specializing in Debezium, Snowflake, Business Objects, Power BI, Java/Python, and SQL, with 3 to 6 years of experience in Software Development/Engineering, you will be a crucial member of our team in either Bangalore or Hyderabad (Position ID: J1124-1679). In this permanent role, your primary responsibility will be the development and maintenance of our applications, ensuring they are robust, user-friendly, and scalable.

Your key duties will include designing, developing, and maintaining web applications utilizing technologies such as Debezium, Snowflake, Business Objects, Power BI, and Pentaho. You will collaborate with cross-functional teams to define, design, and implement new features, writing clean, scalable, and efficient code. Additionally, you will conduct code reviews, perform unit testing and continuous integration, and troubleshoot and resolve technical issues promptly. Staying abreast of emerging technologies and industry trends is essential, and active participation in Agile/Scrum development processes is expected.

To excel in this role, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field, coupled with at least 3 years of experience in full-stack development. Analytical and multitasking skills are advantageous, along with familiarity with tools like JIRA, GitLab, and Confluence. Proficiency in database technologies such as SQL, MySQL, PostgreSQL, or NoSQL databases, as well as experience with version control systems like Git, is preferred. Knowledge of cloud services like AWS, Azure, or Google Cloud, and an understanding of CI/CD pipelines and DevOps practices, will be beneficial.

Soft skills are paramount in this role, with a strong emphasis on problem-solving, communication, collaboration, and the ability to thrive in a fast-paced, agile environment. The successful candidate will exhibit a strong work ethic and a commitment to turning insights into actionable solutions.

At CGI, we prioritize ownership, teamwork, respect, and belonging. As a CGI Partner, you will have the opportunity to contribute from day one, shaping our collective success and actively participating in the company's strategy and direction. Your work will be valued and impactful, allowing you to innovate, build relationships, and leverage global capabilities. CGI offers a supportive environment for career growth, health, and well-being, providing opportunities to enhance your skills and broaden your horizons. Join our team at CGI, one of the world's largest IT and business consulting services firms, and embark on a fulfilling career journey with us.
Posted 2 days ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
As a Data Architect / Data Modeling Expert, you will be an essential part of our offshore team based in India, collaborating closely with Business Analysts and Technical Analysts. Your primary responsibilities will revolve around designing and implementing efficient data models in Snowflake and creating source-to-target mapping documents. Your expertise in data modeling principles, coupled with exposure to ETL tools, will play a crucial role in architecting databases and driving data modeling initiatives that lead to AI solutions.

Your key responsibilities will include:
- Designing and implementing normalized and denormalized data models in Snowflake based on business and technical requirements.
- Collaborating with Business Analysts/Technical Analysts to gather data needs and document requirements effectively.
- Developing source-to-target mapping documents to ensure accurate data transformations.
- Working on data ingestion, transformation, and integration pipelines using SQL and cloud-based tools.
- Optimizing Snowflake queries, schema designs, and indexing for enhanced performance.
- Maintaining clear documentation of data models, mappings, and data flow processes.
- Ensuring data accuracy, consistency, and compliance with best practices in data governance and quality.

You should possess:
- 10+ years of experience in Data Modeling, Data Engineering, or related roles.
- A strong understanding of data modeling concepts such as OLTP, OLAP, Star Schema, and Snowflake Schema.
- Hands-on experience in Snowflake, including schema design and query optimization.
- The ability to create detailed source-to-target mapping documents.
- Proficiency in SQL-based data transformations and queries.
- Exposure to ETL tools, with familiarity with Matillion considered advantageous.
- Strong problem-solving and analytical skills.
- Excellent communication skills for effective collaboration with cross-functional teams.

Preferred qualifications include experience in cloud-based data environments (AWS, Azure, or GCP), hands-on exposure to Matillion or other ETL tools, an understanding of data governance and security best practices, and familiarity with Agile methodologies.
Posted 2 days ago
6.0 - 10.0 years
0 Lacs
delhi
On-site
The client, a leading MNC, specializes in technology consulting and digital solutions for global enterprises. With a vast workforce of over 145,000 professionals across 90+ countries, they cater to 1,100+ clients in various industries. The company offers a comprehensive range of services including consulting, IT solutions, enterprise applications, business processes, engineering, network services, customer experience, AI & analytics, and cloud infrastructure services. Notably, they have been recognized for their commitment to sustainability with the Terra Carta Seal, showcasing their dedication to building a climate- and nature-positive future.

As a Data Engineer with a minimum of 6 years of experience, you will be responsible for constructing and managing data pipelines. The ideal candidate should possess expertise in Databricks, AWS/Azure, and data storage technologies such as databases and distributed file systems. Familiarity with the Spark framework is essential, and prior experience in the retail sector would be advantageous.

Key Responsibilities:
- Design, develop, and maintain scalable ETL pipelines for processing large data volumes from diverse sources.
- Implement and oversee data integration solutions utilizing tools like Databricks, Snowflake, and other relevant technologies.
- Develop and optimize data models and schemas to support analytical and reporting requirements.
- Write efficient and sustainable Python code for data processing and transformations.
- Utilize Apache Spark for distributed data processing and large-scale analytics.
- Translate business needs into technical solutions.
- Ensure data quality and integrity through rigorous unit testing.
- Collaborate with cross-functional teams to integrate data pipelines with other systems.

Technical Requirements:
- Proficiency in Databricks for data integration and processing.
- Experience with ETL tools and processes.
- Strong Python programming skills with Apache Spark, emphasizing data processing and automation.
- Solid SQL skills and familiarity with relational databases.
- Understanding of data warehousing concepts and best practices.
- Exposure to cloud platforms such as AWS and Azure.
- Hands-on troubleshooting ability and problem-solving skills for complex data issues.
- Practical experience with Snowflake.
Posted 2 days ago
2.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
As a Principal Data Engineer at Brillio, you will play a key role in leveraging your expertise in Data Modeling, particularly with tools like ER Studio and ERwin. Brillio, known for its digital technology services and partnership with Fortune 1000 companies, is committed to transforming disruptions into competitive advantages through innovative digital solutions.

With over 10 years of IT experience and at least 2 years of hands-on experience in Snowflake, you will be responsible for building and maintaining data models that support the organization's Data Lake/ODS, ETL processes, and data warehousing needs. Your ability to collaborate closely with clients to deliver physical and logical model solutions will be critical to the success of various projects.

In this role, you will demonstrate advanced expertise in data modeling concepts, with a focus on modeling in large, volume-intensive environments. Your experience with tools like ER Studio and your overall understanding of database technologies, data warehouses, and analytics will be essential in designing and implementing effective data models. Additionally, your strong skills in entity-relationship modeling, knowledge of database design and administration, and proficiency in SQL query development will enable you to contribute to the design and optimization of data structures, including Star Schema design.

Your leadership abilities and excellent communication skills will be instrumental in leading teams and ensuring the successful implementation of data modeling solutions. While experience with AWS ecosystems is a plus, your dedication to staying at the forefront of technological advancements and your passion for delivering exceptional client experiences will make you a valuable addition to Brillio's team of "Brillians." Join us in our mission to create innovative digital solutions and make a difference in the world of technology.
Posted 2 days ago
6.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior Data Engineer at our Pune location, you will play a critical role in designing, developing, and maintaining scalable data pipelines and architectures using Databricks on Azure/AWS cloud platforms. With 6 to 9 years of experience in the field, you will collaborate with stakeholders to integrate large datasets, optimize performance, implement ETL/ELT processes, ensure data governance, and work closely with cross-functional teams to deliver accurate solutions.

Your responsibilities will include building, maintaining, and optimizing data workflows, integrating datasets from various sources, tuning pipelines for performance and scalability, implementing ETL/ELT processes using Spark and Databricks, ensuring data governance, collaborating with different teams, documenting data pipelines, and developing automated processes for continuous integration and deployment of data solutions.

To excel in this role, you should have 6 to 9 years of hands-on experience as a Data Engineer; expertise in Apache Spark, Delta Lake, and Azure/AWS Databricks; proficiency in Python, Scala, or Java; advanced SQL skills; and experience with cloud data platforms, data warehousing solutions, data modeling, ETL tools, version control systems, and automation tools. Soft skills such as problem-solving, attention to detail, and the ability to work in a fast-paced environment are also essential. Nice-to-have skills include experience with Databricks SQL and Databricks Delta, knowledge of machine learning concepts, and experience with CI/CD pipelines for data engineering solutions.

Joining our team offers challenging work with international clients, growth opportunities, a collaborative culture, and global project involvement. We provide competitive salaries, flexible work schedules, health insurance, performance-based bonuses, and other standard benefits. If you are passionate about data engineering, possess the required skills and qualifications, and thrive in a dynamic and innovative environment, we welcome you to apply for this exciting opportunity.
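To ground the Delta Lake skills above, a short, hedged sketch of a Delta MERGE (upsert) with PySpark follows; the paths and join key are illustrative, and the delta-spark package is assumed when running outside a Databricks runtime.

```python
# Illustrative Delta Lake upsert (MERGE) with PySpark; paths and keys are
# stand-ins. On Databricks, Delta support is built in; elsewhere, configure
# the delta-spark package on the SparkSession.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("orders_merge").getOrCreate()

updates = spark.read.parquet("/mnt/landing/orders_updates/")  # hypothetical path
target = DeltaTable.forPath(spark, "/mnt/curated/orders")

# Upsert: update rows whose keys match, insert the rest
(target.alias("t")
       .merge(updates.alias("u"), "t.order_id = u.order_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())
```

MERGE runs as a single atomic transaction on the Delta table, which is what makes this pattern safer than a manual delete-then-insert for incremental loads.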
Posted 2 days ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Quality Engineer, your primary responsibility will be to analyze business and technical requirements to design, develop, and execute comprehensive test plans for ETL pipelines and data transformations. You will perform data validation, reconciliation, and integrity checks across various data sources and target systems. Additionally, you will build and automate data quality checks using SQL and/or Python scripting, and identify, document, and track data quality issues, anomalies, and defects.

Collaboration is key in this role, as you will work closely with data engineers, developers, QA, and business stakeholders to understand data requirements and ensure that data quality standards are met. You will define data quality KPIs and implement continuous monitoring frameworks, and participate in data model reviews, providing input on data quality considerations. In case of data discrepancies, you will perform root cause analysis and work with teams to drive resolution, while ensuring alignment with data governance policies, standards, and best practices.

To qualify for this position, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field, and have 4 to 7 years of experience as a Data Quality Engineer, ETL Tester, or in a similar role. A strong understanding of ETL concepts, data warehousing principles, and relational database design is essential. Proficiency in SQL for complex querying, data profiling, and validation tasks is required, and familiarity with data quality tools, testing methodologies, and modern cloud data ecosystems (AWS, Snowflake, Apache Spark, Redshift) will be advantageous.

Moreover, advanced knowledge of SQL, experience with data pipeline tools like Airflow, DBT, or Informatica, and experience integrating data validation processes into CI/CD pipelines using tools like GitHub Actions or Jenkins are desired qualifications. An understanding of big data platforms, data lakes, non-relational databases, data lineage, and master data management (MDM) concepts, along with experience with Agile/Scrum development methodologies, will be beneficial. Your excellent analytical and problem-solving skills, along with strong attention to detail, will be valuable assets in fulfilling the responsibilities of a Data Quality Engineer.
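As an illustration of the SQL/Python data quality checks described above, a minimal sketch with two common checks — row-count reconciliation between source and target, and a null-rate threshold on a key column. Connection details and table names are stand-ins.

```python
# Minimal data quality checks: reconciliation + null-rate threshold.
# The Snowflake connection and schemas are hypothetical stand-ins.
import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="dq_user",
                                   password="***", database="DW")

def scalar(sql: str) -> int:
    """Run a query and return its single scalar result."""
    with conn.cursor() as cur:
        return cur.execute(sql).fetchone()[0]

# Check 1 — reconciliation: the loaded target must match the source count.
src = scalar("SELECT COUNT(*) FROM staging.orders")
tgt = scalar("SELECT COUNT(*) FROM mart.fact_orders")
assert src == tgt, f"Row-count mismatch: staging={src}, target={tgt}"

# Check 2 — integrity: the key column must stay under a 0.1% null rate.
nulls = scalar("SELECT COUNT(*) FROM mart.fact_orders WHERE customer_id IS NULL")
assert nulls / max(tgt, 1) < 0.001, f"customer_id null rate too high ({nulls} rows)"

print("Data quality checks passed")
conn.close()
```

Checks like these are typically wired into the pipeline itself (e.g., as an Airflow task) so a failed assertion blocks downstream loads instead of silently publishing bad data.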
Posted 2 days ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
The successful candidate will be responsible for developing and maintaining applications using Python, SQL, React.js, and Java. You will also be involved in building and managing data pipelines on platforms such as Databricks, DBT, Snowflake, RSL (Report Specification Language, Geneva), and RDL (Report Definition Language). Your experience with non-functional aspects like performance management, scalability, and availability will be crucial.

Additionally, you will collaborate closely with front-office, operations, and finance teams to enhance reporting and analysis for alternative investments. Working with cross-functional teams, you will drive automation, workflow efficiencies, and reporting enhancements. Troubleshooting system issues, implementing enhancements, and ensuring optimal system performance under a follow-the-sun model for end-to-end application coverage will also be part of your responsibilities.

Qualifications & Experience:
- Education: Bachelor's degree in Computer Science, Engineering, or a related field.
- Experience: Minimum of 2 years of experience in enterprise software development and production management, preferably within financial services.
- Proficiency in at least one programming language: Python, Java, React.js, or SQL.
- Familiarity with alternative investments and their reporting requirements.
- Hands-on experience with relational databases and complex query authoring.
- Ability to thrive in a fast-paced work environment with quick iterations.
- Must be able to work out of our Bangalore office.

Preferred Qualifications:
- Knowledge of AWS/Azure services.
- Previous experience in the asset management/private equity domain.

This role provides an exciting opportunity to work in a fast-paced, engineering-focused startup environment and contribute to meaningful projects that address complex business challenges. Join our team and become part of a culture that values innovation, collaboration, and excellence.

FS Investments: 30 years of leadership in private markets. FS Investments is an alternative asset manager focused on delivering attractive returns across private equity, private credit, and real estate. With the acquisition of Portfolio Advisors in 2023, FS Investments now manages over $85 billion for institutional and wealth management clients globally. With over 30 years of experience and more than 500 employees across nine global offices, the firm's investment professionals oversee a variety of strategies across private markets and maintain relationships with 300+ sponsors. FS Investments' active partnership model fosters superior market insights and deal flow, informing the underwriting process and contributing to strong returns.

FS is an Equal Opportunity Employer. FS Investments does not accept unsolicited resumes from recruiters or search firms. Any resume or referral submitted without a signed agreement is the property of FS Investments, and no fee will be paid.
Posted 2 days ago
10.0 - 14.0 years
0 Lacs
pune, maharashtra
On-site
The Applications Development Technology Lead Analyst role at our organization involves working closely with the Technology team to establish and implement new or updated application systems and programs. Your primary responsibility will be to lead applications systems analysis and programming activities.

As the Applications Development Technology Lead Analyst, you will collaborate with various management teams to ensure seamless integration of functions to achieve organizational goals. You will also be responsible for identifying necessary system enhancements for deploying new products and process improvements, and you will play a key role in resolving high-impact problems and projects by evaluating complex business processes and industry standards. Your expertise in applications programming will be crucial in ensuring that application design aligns with the overall architecture blueprint. You will need a deep understanding of system flow and will develop coding, testing, debugging, and implementation standards, along with a comprehensive knowledge of how different business areas integrate to achieve business objectives.

In this position, you will provide in-depth analysis and innovative solutions to address issues effectively. You will also serve as an advisor or coach to mid-level developers and analysts, assigning work as needed. It is essential to assess risks carefully when making business decisions, with a focus on upholding the firm's reputation and complying with relevant laws and regulations.

To qualify for this role, you should have 6-10 years of relevant experience in Apps Development or systems analysis, extensive experience in system analysis and software application programming, and a track record of managing and implementing successful projects. Being a Subject Matter Expert (SME) in at least one area of Applications Development is advantageous. A Bachelor's degree or equivalent experience is required, while a Master's degree is preferred. The ability to adjust priorities swiftly, demonstrated leadership and project management skills, and clear written and verbal communication are also essential qualifications for this position.

The description above covers the typical responsibilities associated with this role. As a Vice President (VP) in this capacity, you will lead a specific technical vertical (Frontend, Backend, or Data), mentor developers, and ensure timely, scalable, and testable delivery within your domain. Your responsibilities will include leading a team of engineers, translating architecture into execution, reviewing complex components, and driving data platform migration projects. Additionally, you will be expected to evaluate and implement AI-based tools for enhanced productivity, testing, and code improvement.

The required skills for this role include 10-14 years of experience leading development teams and delivering cloud-native solutions, and proficiency in programming languages such as Java, Python, and JavaScript/TypeScript. Familiarity with frameworks like Spring Boot/WebFlux, Angular, and Node.js; databases including Oracle and MongoDB; cloud technologies such as ECS, S3, Lambda, and Kubernetes; and data technologies like Apache Spark and Snowflake is also essential. Strong mentoring, conflict resolution, and cross-team communication skills are important attributes for success in this position.
Posted 2 days ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
You are a strategic thinker passionate about driving solutions in BI and Analytics (Alteryx, SQL, Tableau), and you have found the right team. As a Business Intelligence Developer Associate within our Asset and Wealth Management Finance Transformation and Analytics team, you will be tasked with defining, refining, and achieving set objectives for our firm on a daily basis.

You will be responsible for designing the technical and information architecture for the MIS (DataMarts) and Reporting Environments, and you will support the MIS team in query optimization and deployment of BI technologies, including but not limited to Alteryx, Tableau, MS SQL Server (T-SQL programming), SSIS, and SSRS. You will scope, prioritize, and coordinate activities with the product owners, design and develop complex queries for data inputs, and work on agile improvements by sharing experiences and knowledge with the team. Furthermore, you will advocate for and steer the team to implement a CI/CD (DevOps) workflow, and design and develop complex dashboards from large and/or disparate data sets. The ideal candidate for this position is highly skilled in reporting methodologies, data manipulation and analytics tools, and the visualization and presentation of enterprise data.

Required qualifications, capabilities, and skills include a Bachelor's degree in MIS, Computer Science, or Engineering; a different field of study with significant professional experience in BI development is also acceptable. Strong DW-BI skills are required, with a minimum of 7 years of experience in data warehousing and visualization. You should have strong work experience with data wrangling tools like Alteryx and working proficiency in data visualization tools, including but not limited to Alteryx, Tableau, and MS SQL Server (SSIS, SSRS). Working knowledge of querying data from databases such as MS SQL Server, Snowflake, and Databricks is essential, along with strong knowledge of designing database architecture, building scalable visualization solutions, and the ability to write complex yet efficient SQL queries and stored procedures. Experience building end-to-end ETL processes, working with multiple data sources, handling large volumes of data, and converting data into information is required, as is experience with end-to-end implementation of Business Intelligence (BI) reports and dashboards. Good communication and analytical skills are also necessary.

Preferred qualifications, capabilities, and skills include exposure to data science and allied technologies like Python and R; exposure to automation tools like UiPath, Blue Prism, and Power Automate; working knowledge of CI/CD workflows and automated deployment; and experience with scheduling tools like Control-M.
Posted 2 days ago
8.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
The Marketing Cloud Technical Design Architect at Novartis DDIT, Hyderabad, plays a key role in translating business requirements into IT solution design specifications. You will collaborate with business customers and Strategic Business Partners to analyze demands, propose solutions, and provide funding estimates. Your responsibilities include contributing to technology delivery, leading rapid-prototyping engagements, ensuring on-time delivery of engagements, engaging with SI partners, and driving enterprise-grade solution design and architecture. You will also be responsible for DevSecOps management, following industry trends, ensuring security and compliance, and enhancing user experience.

To qualify for this role, you should have a university degree in a business/technical area with at least 8 years of experience in Solution Design, including 3 years in Salesforce Marketing Cloud; Marketing Cloud certifications are advantageous. You must have practical knowledge of Marketing Automation projects, Salesforce Marketing Cloud integrations, data modeling, AMPscript, SQL, and data mapping. Proficiency in HTML, CSS, and tools that integrate with Marketing Cloud is preferred. Experience in managing global Marketing Automation projects, knowledge of marketing automation concepts, and familiarity with tools like Data Cloud, CDP, MCP, MCI, Google Analytics, Salesforce CRM, MDM, and Snowflake are required.

Novartis is dedicated to reimagining medicine to enhance and prolong lives, with a vision to become the most valued and trusted pharmaceutical company globally. By joining Novartis, you will be part of a mission-driven organization that values diversity and inclusion. If you are a dependable professional with excellent communication skills, attention to detail, and the ability to work in a fast-paced, multicultural environment, this role offers an opportunity to contribute to groundbreaking healthcare advancements.

Novartis is committed to fostering an inclusive work environment and building diverse teams that reflect the patients and communities we serve. If you are looking to be part of a community of dedicated individuals working towards a common goal of improving patient outcomes, consider joining the Novartis Network to stay informed about future career opportunities. Novartis offers a range of benefits and rewards to support your personal and professional growth. If you are passionate about making a difference in the lives of patients and are eager to collaborate with like-minded individuals, explore the opportunities at Novartis and be part of a community focused on creating a brighter future together.

For more information about Novartis and our commitment to diversity and inclusion, visit: https://www.novartis.com/about/strategy/people-and-culture
To stay connected and learn about future career opportunities at Novartis, join our talent community here: https://talentnetwork.novartis.com/network
To learn more about the benefits and rewards offered by Novartis, read our handbook: https://www.novartis.com/careers/benefits-rewards
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
The Data Services ETL Developer specializes in data transformation and integration projects using Zeta's tools, third-party software, and custom code; an understanding of CRM methodologies related to marketing operations is essential. Responsibilities include manipulating client and internal marketing data across various platforms, automating scripts for data transfer, building and managing cloud-based data pipelines using AWS services, managing tasks with competing priorities, and collaborating with technical staff to support a proprietary ETL environment. Close collaboration with database/CRM specialists, modelers, analysts, and application programmers is crucial for delivering results to clients.

The ideal candidate should cover the US time zone, be in the office a minimum of three days per week, and bring: experience in database marketing; knowledge of US and international postal addresses (including SAP postal products); proficiency with AWS services (S3, Airflow, RDS, Athena); experience with Oracle and Snowflake SQL; and familiarity with tools such as Snowflake, Airflow, GitLab, Grafana, LDAP, OpenVPN, DCWEB, Postman, and Microsoft Excel. Knowledge of SQL Server, SFTP, PGP, large-scale customer databases, and the project life cycle is also required, along with proficiency in editors such as Notepad++ and UltraEdit. Strong communication and collaboration skills and the ability to manage multiple tasks simultaneously are essential.

Minimum qualifications include a Bachelor's degree or equivalent with 5+ years of experience in database marketing and cloud-based technologies, a strong understanding of data engineering concepts and cloud infrastructure, and excellent oral and written communication skills.
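As an illustration of the "automating scripts for data transfer" and Airflow/S3 skills named above, here is a minimal Airflow sketch that stages a client feed from S3 for a downstream ETL step. The DAG name, bucket, key, and paths are hypothetical, and this assumes an Airflow 2.4+ environment with boto3 credentials configured.

```python
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator


def stage_client_file(**_):
    """Pull a client feed from S3 into local staging for the proprietary ETL."""
    s3 = boto3.client("s3")
    # Hypothetical bucket and key; real jobs would derive these per client.
    s3.download_file("client-drop-zone", "acme/daily_feed.csv",
                     "/tmp/daily_feed.csv")


with DAG(
    dag_id="client_feed_staging",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
):
    PythonOperator(task_id="stage_file", python_callable=stage_client_file)
```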
Posted 3 days ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
Agivant is seeking a talented and passionate Senior Data Engineer to join our growing data team. In this role, you will play a key part in building and scaling our data infrastructure, enabling data-driven decision-making across the organization. You will be responsible for designing, developing, and maintaining efficient and reliable data pipelines for both ELT (Extract, Load, Transform) and ETL (Extract, Transform, Load) processes.

Responsibilities:
- Design, develop, and maintain robust and scalable data pipelines for ELT and ETL processes, ensuring data accuracy, completeness, and timeliness.
- Work with stakeholders to understand data requirements and translate them into efficient data models and pipelines.
- Build and optimize data pipelines using a variety of technologies, including Elasticsearch, AWS S3, Snowflake, and NFS.
- Develop and maintain data warehouse schemas and ETL/ELT processes to support business intelligence and analytics needs.
- Implement data quality checks and monitoring to ensure data integrity and identify potential issues (a minimal example appears after this listing).
- Collaborate with data scientists and analysts to ensure data accessibility and usability for various analytical purposes.
- Stay current with industry best practices, CI/CD/DevSecFinOps, Scrum, and emerging technologies in data engineering.
- Contribute to the development and enhancement of our data warehouse architecture.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience as a Data Engineer with a strong focus on ELT/ETL processes.
- At least 3 years of experience with Snowflake data warehousing technologies.
- At least 3 years of experience creating and maintaining Airflow ETL pipelines.
- A minimum of 3 years of professional experience with Python for data manipulation and automation.
- Working experience with Elasticsearch and its application in data pipelines.
- Proficiency in SQL and experience with data modeling techniques.
- Strong understanding of cloud-based data storage solutions such as AWS S3.
- Experience working with NFS and other file storage systems.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
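The sketch below illustrates the data quality checks referenced in the responsibilities: a post-load validation of a Snowflake table that fails the pipeline run on empty tables or NULL keys. The connection parameters and table name are hypothetical; COUNT_IF is Snowflake's conditional aggregate.

```python
import snowflake.connector  # assumes the snowflake-connector-python package


def check_table_health(conn, table: str, key_col: str, min_rows: int = 1) -> int:
    """Raise if the table is empty or the key column contains NULLs."""
    with conn.cursor() as cur:
        cur.execute(f"SELECT COUNT(*), COUNT_IF({key_col} IS NULL) FROM {table}")
        total, null_keys = cur.fetchone()
    if total < min_rows:
        raise ValueError(f"{table}: expected at least {min_rows} rows, got {total}")
    if null_keys:
        raise ValueError(f"{table}: {null_keys} rows have NULL {key_col}")
    return total


# Hypothetical connection parameters and table name.
conn = snowflake.connector.connect(account="acme-xy12345", user="etl_user",
                                   password="***", warehouse="ETL_WH")
check_table_health(conn, "analytics.orders", "order_id")
```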
Posted 3 days ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
Sykatiya Technology Pvt Ltd is a leading semiconductor-industry innovator committed to leveraging cutting-edge technology to solve complex problems. We are currently looking for a highly skilled and motivated Data Scientist to join our dynamic team and contribute to our mission of driving innovation through data-driven insights.

As the Lead Data Scientist and Machine Learning Engineer at Sykatiya Technology Pvt Ltd, you will play a crucial role in analyzing large datasets to uncover patterns, develop predictive models, and implement AI/ML solutions. Your responsibilities will include working on projects involving neural networks, deep learning, data mining, and natural language processing (NLP) to drive business value and enhance our products and services.

Key Responsibilities:
- Lead the design and implementation of machine learning models and algorithms to address complex business problems.
- Utilize deep learning techniques to enhance neural network models and improve prediction accuracy.
- Conduct data mining and analysis to extract actionable insights from both structured and unstructured data.
- Apply natural language processing (NLP) techniques for advanced text analytics (see the sketch after this listing).
- Develop and maintain end-to-end data pipelines, ensuring data integrity and reliability.
- Collaborate with cross-functional teams to understand business requirements and deliver data-driven solutions.
- Mentor and guide junior data scientists and engineers in best practices and advanced techniques.
- Stay updated on the latest advancements in AI/ML, neural networks, deep learning, data mining, and NLP.

Technical Skills:
- Proficiency in Python and its libraries, such as NumPy, pandas, scikit-learn, TensorFlow, Keras, and PyTorch.
- Strong understanding of machine learning algorithms and techniques.
- Extensive experience with neural networks and deep learning frameworks.
- Hands-on experience with data mining and analysis techniques.
- Proficiency with natural language processing (NLP) tools and libraries such as NLTK, spaCy, and transformers.
- Proficiency in big data technologies, including Sqoop, Hadoop, HDFS, Hive, and PySpark.
- Experience with cloud platforms, including AWS services such as S3, Step Functions, EventBridge, Athena, RDS, Lambda, and Glue.
- Strong knowledge of database management systems such as SQL, Teradata, MySQL, PostgreSQL, and Snowflake.
- Familiarity with other tools such as ExactTarget, Marketo, SAP BO, Agile, and JIRA.
- Strong analytical skills for analyzing large datasets and deriving actionable insights.
- Excellent problem-solving skills with the ability to think critically and creatively.
- Effective communication and teamwork skills for collaborating with various stakeholders.

Experience:
- 8 to 12 years of experience in a similar role.
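For a flavor of the NLP work named in the responsibilities, here is a minimal, self-contained sketch using the scikit-learn stack from the skills list: a TF-IDF text classifier trained on toy data. It is illustrative only, not the company's actual modeling approach.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labeled data; a real project would pull labeled text from a data pipeline.
texts = ["great product", "terrible service", "love this product", "awful experience"]
labels = [1, 0, 1, 0]  # 1 = positive sentiment, 0 = negative

# TF-IDF features (unigrams + bigrams) feeding a linear classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["love the product"]))  # likely [1] on this toy data
```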
Posted 3 days ago