13.0 - 17.0 years
0 Lacs
hyderabad, telangana
On-site
As an Architecture Senior Advisor at Evernorth Health Services, a division of The Cigna Group, you will play a crucial role in transforming healthcare and improving the lives of millions of members. By leveraging cutting-edge Data & Analytics technologies, you will design and architect enterprise-grade data analytics solutions, with a focus on building scalable cloud infrastructure on AWS. Your work will impact millions of customers, members, and employers who rely on The Cigna Group daily.

Your primary responsibilities will include designing and architecting data analytics solutions that align with industry best practices and organizational goals, developing effective data pipelines and validation mechanisms, leading project teams in setting technical direction for large-scale data initiatives, collaborating with overseas HIH resources, advocating for best practices and governance, exhibiting attention to detail in system design, demonstrating a proactive mindset for problem-solving, and communicating technical concepts effectively to diverse audiences.

To excel in this role, you must possess extensive experience in data architecture and cloud infrastructure, a strong background in designing robust data pipelines and workflows, knowledge of software engineering principles, leadership experience, experience working in matrix organizations, strong analytical and problem-solving skills, a proactive and independent work style, excellent communication skills, and attention to detail in system design. Preferred skills include experience with the Databricks, Snowflake, and Teradata platforms; familiarity with data governance and security practices in cloud environments; proficiency in SQL, Python, or Scala; and expertise in designing architecture, building API integrations, and configuring and deploying cloud services.

The ideal candidate will have 13 to 16 years of experience in data architecture or solution architecture, particularly in data analytics or cloud computing, proven experience building enterprise-level data and analytics solutions on AWS, at least 5 years of leadership experience, a minimum of 6 years of experience working in a matrix organization, and a Bachelor's degree in Computer Science, Information Systems, Artificial Intelligence, or a related field.
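For illustration only (not part of the employer's posting): a minimal PySpark sketch of the kind of pipeline validation mechanism described above, under assumed S3 paths and column names, which quarantines records that fail basic quality rules before the curated zone is loaded.

```python
# Illustrative only: a simple validate-and-quarantine step for a data pipeline.
# The S3 paths and column names (member_id, claim_amount) are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("claims-validation").getOrCreate()

claims = spark.read.parquet("s3://example-bucket/raw/claims/")  # assumed raw zone

# Basic quality rules: required key present, amount numeric and non-negative.
rules = (
    col("member_id").isNotNull()
    & col("claim_amount").cast("double").isNotNull()
    & (col("claim_amount") >= 0)
)

valid = claims.filter(rules)
rejected = claims.filter(~rules)

valid.write.mode("append").parquet("s3://example-bucket/curated/claims/")
rejected.write.mode("append").parquet("s3://example-bucket/quarantine/claims/")

print(f"loaded={valid.count()}, quarantined={rejected.count()}")  # simple run summary
```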
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
navi mumbai, maharashtra
On-site
The ideal candidate will be responsible for designing and implementing streaming data pipelines that integrate Kafka with Databricks using Structured Streaming. You will also be tasked with architecting and maintaining the Medallion Architecture, which consists of well-defined Bronze, Silver, and Gold layers. Additionally, you will need to implement efficient data ingestion processes using Databricks Autoloader for high-throughput data loads.

You will work with large volumes of structured and unstructured data to ensure high availability and performance, applying performance tuning techniques like partitioning, caching, and cluster resource optimization. Collaboration with cross-functional teams, including data scientists, analysts, and business users, is essential to build robust data solutions. The role also involves establishing best practices for code versioning, deployment automation, and data governance.

The required technical skills for this position include strong expertise in Azure Databricks and Spark Structured Streaming, along with at least 7 years of experience in Data Engineering. You should be familiar with processing modes (append, update, complete), output modes (append, complete, update), checkpointing, and state management. Experience with Kafka integration for real-time data pipelines, a deep understanding of the Medallion Architecture, proficiency with Databricks Autoloader and schema evolution, and familiarity with Unity Catalog and foreign catalogs are also necessary. Strong knowledge of Spark SQL, Delta Lake, and DataFrames; expertise in performance tuning, data management strategies, governance, access management, data modeling, data warehousing concepts, and Databricks as a platform; and a solid understanding of window functions will be beneficial in this role.
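As a hedged illustration of the streaming pattern this role describes (Kafka into the Bronze layer of a Medallion Architecture with Spark Structured Streaming), here is a minimal PySpark sketch; the broker address, topic, checkpoint path, and table names are placeholders, not details from the posting.

```python
# Illustrative sketch: Kafka -> Bronze Delta table with Spark Structured Streaming.
# Broker, topic, checkpoint path, and table name are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, current_timestamp

spark = SparkSession.builder.appName("kafka-bronze-ingest").getOrCreate()

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "orders")                     # placeholder topic
    .option("startingOffsets", "latest")
    .load()
)

# Keep the payload raw in Bronze; parsing and validation happen in the Silver layer.
bronze = (
    raw.select(
        col("key").cast("string").alias("key"),
        col("value").cast("string").alias("payload"),
        col("timestamp").alias("event_ts"),
    )
    .withColumn("ingest_ts", current_timestamp())
)

query = (
    bronze.writeStream
    .format("delta")
    .outputMode("append")  # append output mode suits an immutable Bronze layer
    .option("checkpointLocation", "/mnt/checkpoints/orders_bronze")  # checkpointing for recovery
    .toTable("bronze.orders_raw")  # starts the stream into the Bronze table
)
```

For file-based sources, Databricks Auto Loader follows the same streaming pattern via `spark.readStream.format("cloudFiles")`, with schema evolution managed through its schema location option.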
Posted 1 week ago
3.0 - 10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Line of Service: Advisory. Industry/Sector: Not Applicable. Specialism: Data, Analytics & AI. Management Level: Associate.

Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients run their business more effectively and understand what business questions can be answered and how to unlock the answers.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

Responsibilities: Analyse current business practices, processes, and procedures, and identify future business opportunities for leveraging Microsoft Azure Data & Analytics Services. Provide technical leadership and thought leadership as a senior member of the Analytics Practice in areas such as data access and ingestion, data processing, data integration, data modeling, database design and implementation, data visualization, and advanced analytics. Engage and collaborate with customers to understand business requirements/use cases and translate them into detailed technical specifications. Develop best practices, including reusable code, libraries, patterns, and consumable frameworks, for cloud-based data warehousing and ETL. Maintain best practice standards for the development of cloud-based data warehouse solutions, including naming standards. Design and implement highly performant data pipelines from multiple sources using Apache Spark and/or Azure Databricks. Integrate the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is always maintained. Work with other members of the project team to support delivery of additional project components (API interfaces). Evaluate the performance and applicability of multiple tools against customer requirements. Work within an Agile delivery / DevOps methodology to deliver proof-of-concept and production implementations in iterative sprints. Integrate Databricks with other technologies (ingestion tools, visualization tools). A brief illustrative sketch of such a pipeline appears at the end of this posting.

Requirements: Proven experience working as a data engineer. Highly proficient in using the Spark framework (Python and/or Scala). Extensive knowledge of data warehousing concepts, strategies, and methodologies.

Mandatory Skill Sets: ADE, ADB, ADF. Preferred Skill Sets: ADE, ADB, ADF. Years of Experience Required: 3-10 years. Education Qualification: BE, B.Tech, MCA, M.Tech. Degrees/Field of Study Required: Bachelor of Engineering, Master of Engineering. Required Skills: Databricks Platform, Extract Transform Load (ETL), PySpark, Python (Programming Language), Structured Query Language (SQL). Travel Requirements: Not Specified. Available for Work Visa Sponsorship: No. Government Clearance Required: No.
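Purely as an illustrative sketch (assumed lake paths, column names, and target table; not PwC's actual framework), the following shows the shape of a batch ingestion-and-transform step on Azure Databricks that the responsibilities above describe: read raw CSV from ADLS, apply basic cleansing, and write a partitioned Delta table.

```python
# Illustrative sketch: batch ingest from an assumed ADLS path into a partitioned Delta table.
# Storage account, container, columns, and target table are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date, trim

spark = SparkSession.builder.appName("sales-batch-etl").getOrCreate()

raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("abfss://raw@examplelake.dfs.core.windows.net/sales/")  # hypothetical source
)

# Basic cleansing to keep quality and consistency between source and target.
clean = (
    raw.dropDuplicates(["order_id"])                    # assumed business key
    .filter(col("order_id").isNotNull())
    .withColumn("customer_name", trim(col("customer_name")))
    .withColumn("order_date", to_date(col("order_date")))
)

(
    clean.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("curated.sales_orders")                # hypothetical target table
)
```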
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a Test Engineer, you will be responsible for supporting the AI & Data Science teams in testing their AI/ML flows. You will analyze system specifications, develop detailed test plans and test cases, execute test cases, and identify defects. Your role includes documenting and reporting defects to the development team, collaborating with them to resolve issues, and ensuring that the software meets quality standards and best practices. Participation in review meetings to provide feedback is also part of your responsibilities.

It is essential to have excellent knowledge of the SDLC and STLC, along with expertise in Agile methodology. Your technical skills should include a strong understanding of testing frameworks and automation concepts, as well as proficiency in Pandas, Python, Pytest, SQL, SparkSQL, and PySpark, and experience testing LLMs such as GPT, LLaMA, and Gemma. Additionally, good database skills in any relational DB, hands-on experience with the Databricks Platform, and the ability to comprehend models and write Python scripts to test data inflow and outflow are required. You should also be proficient in programming and query languages, with knowledge of cloud platforms, preferably Azure fundamentals and Azure analytics services. Writing test scripts from designs and expertise in Jira, Excel, and Confluence are important technical skills.

In terms of soft skills, excellent verbal and written communication skills in English are necessary. You should be able to work both independently and as part of a team, demonstrating strong project leadership and communication skills, including customer-facing interactions. It would be beneficial to have skills in API testing, test automation, Azure AI services, and any vector DB or graph DB. Familiarity with ML and NLP algorithms, entity mining and clustering, and sentiment analysis is also considered advantageous for this role.
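To make the testing expectations concrete, here is a minimal pytest sketch of the kind of data validation check described above; `normalize_scores` is a hypothetical stand-in for a real transformation under test, not a function from the employer's codebase.

```python
# Illustrative pytest sketch: validating a hypothetical data transformation.
import pandas as pd


def normalize_scores(df: pd.DataFrame) -> pd.DataFrame:
    """Hypothetical transformation under test: scale 'score' into [0, 1]."""
    out = df.copy()
    out["score"] = (out["score"] - out["score"].min()) / (out["score"].max() - out["score"].min())
    return out


def test_scores_are_normalized():
    df = pd.DataFrame({"id": [1, 2, 3], "score": [10.0, 20.0, 30.0]})
    result = normalize_scores(df)
    assert result["score"].between(0, 1).all()   # values inside the expected range
    assert result["score"].isna().sum() == 0      # no nulls introduced by the transformation


def test_row_count_is_preserved():
    df = pd.DataFrame({"id": [1, 2], "score": [5.0, 7.0]})
    assert len(normalize_scores(df)) == len(df)   # the transformation must not drop rows
```

Run with `pytest` from the project directory; the same pattern extends to PySpark DataFrames and API responses.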
Posted 1 month ago
7.0 - 11.0 years
0 Lacs
maharashtra
On-site
As a Technical Lead - Azure Databricks at CitiusTech, you will join an Agile team to contribute to the design and development of healthcare applications by implementing new features while ensuring adherence to the best coding development standards.

Your responsibilities will include creating and configuring Databricks workspaces across various environments, setting up clusters, pools, and workspace permissions according to enterprise standards, designing and developing reusable Databricks notebooks for data ingestion, transformation, and analytics, implementing scalable Python scripts for large-scale data processing, orchestrating Databricks Workflows to automate notebook execution, DLT pipelines, and ML model deployment, as well as building and managing Delta Live Tables pipelines for real-time and batch data processing. You will also be tasked with ensuring data quality, lineage, and monitoring using DLT best practices, developing and deploying MLflow pipelines within Databricks for model tracking, versioning, and deployment, collaborating with data scientists to operationalize machine learning models using MLflow, leveraging Unity Catalog for data governance, access control, and lineage tracking, and ensuring compliance with data security and privacy standards.

To excel in this role, you should have 7-8 years of experience and hold an Engineering Degree in BE / ME / BTech / MTech / BSc / MSc. Mandatory technical skills include proven experience with the Databricks platform, proficiency in Python, hands-on experience with Delta Live Tables (DLT) and MLflow, familiarity with Unity Catalog, experience in cloud platforms like Azure, AWS, or GCP, understanding of CI/CD practices and DevOps in data engineering, as well as excellent problem-solving and communication skills.

CitiusTech is committed to combining IT services, consulting, products, accelerators, and frameworks with a client-first mindset and next-gen tech understanding to humanize healthcare and make a positive impact on human lives. Our culture fosters innovation and continuous learning, promoting a fun, transparent, non-hierarchical, and diverse work environment focused on work-life balance. By joining CitiusTech, you will have the opportunity to collaborate with global leaders, shape the future of healthcare, and positively impact human lives. Discover more about CitiusTech by visiting https://www.citiustech.com/careers and be part of a team that is dedicated to Faster Growth, Higher Learning, and Stronger Impact. Happy applying!
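As a small, generic illustration of the MLflow model-tracking workflow mentioned in this role (not CitiusTech's pipeline; the experiment name and parameters are made up), the sketch below trains a scikit-learn model and logs its parameters, metric, and artifact so the model can later be registered and deployed.

```python
# Illustrative sketch of MLflow experiment tracking; experiment name is hypothetical.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

mlflow.set_experiment("churn_experiment")  # hypothetical experiment name
with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))

    mlflow.log_param("n_estimators", 100)          # record hyperparameters for comparison
    mlflow.log_metric("accuracy", acc)             # record the evaluation metric
    mlflow.sklearn.log_model(model, "model")       # versioned artifact for later registration
```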
Posted 1 month ago
5.0 - 7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
SLSQ326R837

As an Account Executive, your mission will be to help further build our India business, which is one of our fastest-growing markets in APJ. The Databricks Sales Team is driving growth through strategic and innovative partnerships with our customers, helping businesses thrive by solving the world's toughest problems with our solutions. You will be inspiring and guiding customers on their data journey, making organisations more collaborative and productive than ever before. You will play an important role in the business in India, with the opportunity to strategically build your territory in close partnership with the business leaders. Using your passion for technology and drive to build, you will help businesses all across India reach their full potential through the power of Databricks. You know how to sell innovation and change and can guide deals forward to compress decision cycles. You love understanding a product in depth and are passionate about communicating its value to customers and partners. Always prospecting for new opportunities, you will close new accounts while growing our business in existing accounts.

The Impact You Will Have: Identify top VCs in the region to evangelise the Databricks Platform. Work closely with account managers to help them onboard startups on the Databricks Startup Program. Work closely with strategy leadership across India, ASEAN, and Japan to scale the startup business across the region. Run cadences with VCs to cultivate long-term relationships across their portfolios. Assess your existing customers and develop a strategy to identify and engage all buying centres. Use a solution approach to selling and creating value for customers. Identify the most viable use cases in each account to maximise Databricks' impact. Orchestrate and work with teams to maximise the impact of the Databricks ecosystem on your territory. Build value with all engagements to promote successful negotiations and close. Promote the Databricks enterprise cloud data platform. Be customer-focused by delivering technical and business results using the Databricks Platform. Promote teamwork.

What We Look For: You have previously worked in an early-stage company, and you know how to navigate and be successful in a fast-growing organisation. A minimum of 5 years of sales experience in SaaS/PaaS or big data companies. Prior customer relationships with CIOs and important decision-makers. The ability to articulate intricate cloud and big data technologies simply. 3+ years of experience exceeding sales quotas. Success in closing new accounts while upselling existing accounts.

About Databricks: Databricks is the data and AI company. More than 10,000 organizations worldwide, including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500, rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook.

Benefits: At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please visit https://www.mybenefitsnow.com/databricks.

Our Commitment to Diversity and Inclusion: At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics.

Compliance: If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.
Posted 1 month ago
5.0 - 10.0 years
2 - 7 Lacs
Navi Mumbai, Maharashtra, India
On-site
Job Description: Application Developer
Project Role Title: Application Developer
Description: Design, build, and configure applications to meet business process and application requirements.

Skills & Qualifications
Must Have Skills: Databricks Unified Data Analytics Platform. Good to Have Skills: N/A. Minimum Experience Required: 5 years. Educational Qualification: 15 years full-time education.

Summary
As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements at our Bhubaneswar office. Your typical day will involve developing innovative solutions to address business needs while collaborating with cross-functional teams to ensure successful project delivery.

Roles & Responsibilities
Act as a Subject Matter Expert (SME). Collaborate and manage the team to ensure performance and efficiency. Make key team decisions and provide strategic direction. Engage with multiple teams and contribute to critical business decisions. Provide solutions to problems for both the immediate team and cross-functional teams. Lead the development and implementation of new software applications. Conduct code reviews and ensure adherence to coding standards. Troubleshoot and resolve complex technical issues.

Professional & Technical Skills
Must Have: Proficiency in Databricks Unified Data Analytics Platform. Strong understanding of data analytics and data processing. Experience in developing and deploying applications using Databricks. Knowledge of cloud computing and data storage solutions. Hands-on experience with data modeling and database design.

Additional Information
The candidate must have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform. This position is based at our Bhubaneswar office. A 15-year full-time education is required.
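For context on the data modeling skills listed above, here is a minimal, hypothetical example of defining a Delta table on the Databricks platform through Spark SQL; the schema, table, and column names are illustrative only.

```python
# Illustrative sketch: defining a simple dimension table as a Delta table via Spark SQL.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("model-definition").getOrCreate()

spark.sql("CREATE SCHEMA IF NOT EXISTS analytics")  # hypothetical schema

spark.sql("""
    CREATE TABLE IF NOT EXISTS analytics.dim_customer (
        customer_id    BIGINT NOT NULL,
        customer_name  STRING,
        segment        STRING,
        effective_from DATE,
        effective_to   DATE
    )
    USING DELTA
    COMMENT 'Illustrative slowly changing customer dimension'
""")
```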
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As a Databricks platform administrator, you will be responsible for managing the Databricks platform and associated cloud resources. Your primary focus will be on ensuring the optimal performance, security, and efficiency of Databricks clusters and workspaces. This is a long-term contract role based in Bengaluru, Karnataka, with a hybrid work arrangement. You must have at least 5 years of experience working with the Databricks platform specifically as an administrator, not as a data engineer. In addition, cloud experience is required for this role.

Your responsibilities will include configuring, deploying, and maintaining Databricks clusters and workspaces using tools like Terraform. You will monitor cluster performance, troubleshoot issues, and optimize configurations for performance and cost-effectiveness. Security is a key aspect of the role, as you will manage access controls and encryption mechanisms and implement security policies to protect sensitive data.

Collaboration is essential in this role, as you will work closely with application development teams, data engineers, data scientists, and business analysts to understand their requirements and provide technical solutions. You will also conduct training sessions to educate users on platform best practices and capabilities. In addition, you will be responsible for managing platform costs, implementing backup and disaster recovery strategies, and integrating Databricks with other data sources, data warehouses, and data lakes. Working within an Agile delivery/DevOps methodology, you will support the application development teams in debugging and issue resolution.

Overall, as a Databricks platform administrator, you will play a crucial role in ensuring the smooth operation and continuous improvement of the Databricks platform to meet the organization's data processing and analytics needs.
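A minimal sketch of routine cluster administration, assuming the Databricks SDK for Python (`databricks-sdk`) and preconfigured workspace credentials; the cluster name, runtime label, and node type are illustrative and should not be read as the employer's standards.

```python
# Illustrative sketch of programmatic cluster administration with the Databricks SDK for Python.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # reads credentials from the environment or .databrickscfg

# Create a small, cost-conscious cluster that shuts itself down when idle.
cluster = w.clusters.create_and_wait(
    cluster_name="analytics-shared",      # illustrative name
    spark_version="14.3.x-scala2.12",     # assumed LTS runtime label
    node_type_id="Standard_DS3_v2",       # assumed Azure node type
    num_workers=2,
    autotermination_minutes=30,
)

# List clusters to verify state as part of routine monitoring.
for c in w.clusters.list():
    print(c.cluster_name, c.state)
```

The SDK also exposes instance pools, cluster policies, and permissions, which complements Terraform-based provisioning for day-to-day administration.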
Posted 2 months ago