
6093 Scala Jobs - Page 34

JobPe aggregates listings for easy browsing; applications are submitted directly on the original job portal.

0.0 - 4.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

As a Data Science Trainer, your primary responsibility will be to deliver advanced training sessions on a wide range of data science topics, including Python Programming for Data Science, Mathematics & Statistics for AI, AI Ethics, Data Access, Handling and Visualization, Analyzing Data with Python, Visualizing Data with Python, Tableau, Machine Learning, Deep Learning, Natural Language Processing, Computer Vision (CV), Full Stack Development, Generative AI, R Programming, Scala, and Spark. This is a full-time position that requires you to conduct training sessions in person, with an expected start date of 23 July 2025.

Posted 2 weeks ago

Apply

5.0 years

15 - 25 Lacs

Gurugram, Haryana, India

On-site

Exp: 5 - 12 Yrs | Work Mode: Hybrid | Location: Bangalore, Chennai, Kolkata, Pune, and Gurgaon
Primary Skills: Snowflake, SQL, DWH, Power BI, ETL, and Informatica.

We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.

Key Responsibilities:
- Minimum of 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies.
- Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT (Extract, Load, Transform) processes across various data sources.
- SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
- Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type 2), using DBT. Design and optimize high-performance data architectures.
- Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
- Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards.
- Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions, and create and maintain clear documentation for data processes, data models, and pipelines.

Skills & Qualifications:
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases and writing complex queries.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Experience using Power BI for data visualization and business intelligence reporting.
- Experience with Fivetran for automated ELT pipelines.
- Familiarity with Sigma Computing, Tableau, Oracle, and DBT.
- Strong data analysis, requirement gathering, and mapping skills.
- Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF), AWS, or GCP.
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing (other languages such as Java or Scala are a plus).

Skills: DWH, GCP, AWS, Snowflake, Airflow, Snowpipe, data analysis, SQL, data architecture, Tableau, performance tuning, pipelines, Oracle, ETL, data modeling, Azure, Python, DBT, Azkaban, Power BI, Fivetran, Sigma Computing, data warehousing, Luigi, Informatica
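The SCD Type 2 workflow this listing describes can be pictured with a short sketch. The Spark/Scala version below is illustrative only; the posting implements the pattern with DBT, and every table and column name here (customer_id, attr_hash, valid_from, valid_to, is_current) is a hypothetical placeholder.

```scala
// Minimal SCD Type 2 sketch in Spark/Scala. Illustrative only: the posting
// uses DBT for this. Assumes `incoming` carries the same business columns
// as `current`, minus the three SCD bookkeeping columns, plus an attr_hash
// over the tracked attributes.
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions._

object ScdType2Sketch {
  def applyScd2(current: DataFrame, incoming: DataFrame, loadDate: String): DataFrame = {
    val open   = current.filter(col("is_current"))
    val closed = current.filter(!col("is_current"))

    // Keys whose tracked attributes changed in this batch (hash comparison).
    val changedKeys = open.alias("o")
      .join(incoming.alias("i"), col("o.customer_id") === col("i.customer_id"))
      .filter(col("o.attr_hash") =!= col("i.attr_hash"))
      .select(col("o.customer_id").as("customer_id"))

    // Close superseded versions: stamp valid_to and flip is_current.
    val expired = open.join(changedKeys, Seq("customer_id"), "left_semi")
      .withColumn("valid_to", lit(loadDate))
      .withColumn("is_current", lit(false))

    // Open rows that did not change stay as they are.
    val unchanged = open.join(changedKeys, Seq("customer_id"), "left_anti")

    // Changed or brand-new keys get a fresh open version.
    val fresh = incoming.join(unchanged, Seq("customer_id"), "left_anti")
      .withColumn("valid_from", lit(loadDate))
      .withColumn("valid_to", lit(null).cast("string"))
      .withColumn("is_current", lit(true))

    closed.unionByName(expired).unionByName(unchanged).unionByName(fresh)
  }
}
```

The hash-comparison shortcut mirrors what DBT's `check` snapshot strategy does column by column: history is preserved because old versions are closed out rather than overwritten.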

Posted 2 weeks ago

Apply

5.0 years

15 - 25 Lacs

Chennai, Tamil Nadu, India

On-site

Exp: 5 - 12 Yrs | Work Mode: Hybrid | Location: Bangalore, Chennai, Kolkata, Pune, and Gurgaon
Primary Skills: Snowflake, SQL, DWH, Power BI, ETL, and Informatica.

We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.

Key Responsibilities:
- Minimum of 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies.
- Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT (Extract, Load, Transform) processes across various data sources.
- SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
- Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type 2), using DBT. Design and optimize high-performance data architectures.
- Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
- Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards.
- Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions, and create and maintain clear documentation for data processes, data models, and pipelines.

Skills & Qualifications:
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases and writing complex queries.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Experience using Power BI for data visualization and business intelligence reporting.
- Experience with Fivetran for automated ELT pipelines.
- Familiarity with Sigma Computing, Tableau, Oracle, and DBT.
- Strong data analysis, requirement gathering, and mapping skills.
- Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF), AWS, or GCP.
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing (other languages such as Java or Scala are a plus).

Skills: DWH, GCP, AWS, Snowflake, Airflow, Snowpipe, data analysis, SQL, data architecture, Tableau, performance tuning, pipelines, Oracle, ETL, data modeling, Azure, Python, DBT, Azkaban, Power BI, Fivetran, Sigma Computing, data warehousing, Luigi, Informatica

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

About Us: CLOUDSUFI, a Google Cloud Premier Partner, is a Data Science and Product Engineering organization building products and solutions for the Technology and Enterprise industries. We firmly believe in the power of data to transform businesses and enable better decisions. We combine unmatched experience in business processes with cutting-edge infrastructure and cloud services, and we partner with our customers to monetize their data and make enterprise data dance.

Our Values: We are a passionate and empathetic team that prioritizes human values. Our purpose is to elevate the quality of lives for our family, customers, partners, and the community.

Equal Opportunity Statement: CLOUDSUFI is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified candidates receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, or national origin status. We provide equal opportunities in employment, advancement, and all other areas of our workplace.

What we are looking for:
Experience: 5-10 years
Education: BTech / BE / MCA / MSc Computer Science

Position Overview: We are seeking an experienced Data Engineer to design, develop, and productionize graph database solutions using Neo4j for economic data analysis and modeling. This role requires expertise in graph database architecture, data pipeline development, and production system deployment.

Key Responsibilities:

Graph Database Development:
- Design and implement Neo4j graph database schemas for complex economic datasets
- Develop efficient graph data models representing economic relationships, transactions, and market dynamics
- Create and optimize Cypher queries for complex analytical workloads
- Build graph-based data pipelines for real-time and batch processing

Data Engineering & Pipeline Development:
- Architect scalable data ingestion frameworks for structured and unstructured economic data
- Develop ETL/ELT processes to transform relational and time-series data into graph formats
- Implement data validation, quality checks, and monitoring systems
- Build APIs and services for graph data access and manipulation

Production Systems & Operations:
- Deploy and maintain Neo4j clusters in production environments
- Implement backup, disaster recovery, and high-availability solutions
- Monitor database performance, optimize queries, and manage capacity planning
- Establish CI/CD pipelines for graph database deployments

Economic Data Specialization:
- Model financial market relationships, economic indicators, and trading networks
- Create graph representations of supply chains, market structures, and economic flows
- Develop graph analytics for fraud detection, risk assessment, and market analysis
- Collaborate with economists and analysts to translate business requirements into graph solutions

Required Qualifications:

Technical Skills:
- Neo4j Expertise: 3+ years of hands-on experience with Neo4j database development
- Graph Modeling: Strong understanding of graph theory and data modeling principles
- Cypher Query Language: Advanced proficiency in writing complex Cypher queries
- Programming: Python, Java, or Scala for data processing and application development
- Data Pipeline Tools: Experience with Apache Kafka, Apache Spark, or similar frameworks
- Cloud Platforms: AWS, GCP, or Azure, with containerization (Docker, Kubernetes)

Database & Infrastructure:
- Experience with graph database administration and performance tuning
- Knowledge of distributed systems and database clustering
- Understanding of data warehousing concepts and dimensional modeling
- Familiarity with other databases (PostgreSQL, MongoDB, Elasticsearch)

Economic Data Experience:
- Experience working with financial datasets, market data, or economic indicators
- Understanding of financial data structures and regulatory requirements
- Knowledge of data governance and compliance in financial services

Preferred Qualifications:
- Neo4j Certification: Neo4j Certified Professional or Graph Data Science certification
- Advanced Degree: Master's in Computer Science, Economics, or a related field
- Industry Experience: 5+ years in financial services, fintech, or economic research
- Additional Skills: Machine learning on graphs, network analysis, time-series analysis

Technical Environment:
- Neo4j Enterprise Edition with APOC procedures
- Apache Kafka for streaming data ingestion
- Apache Spark for large-scale data processing
- Docker and Kubernetes for containerized deployments
- Git and Jenkins/GitLab CI for version control and deployment
- Monitoring tools: Prometheus, Grafana, ELK stack

Application Requirements:
- Portfolio demonstrating Neo4j graph database projects
- Examples of production graph systems you've built
- Experience with economic or financial data modeling preferred
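To make the Cypher and driver work described above concrete, here is a small, hedged Scala sketch that runs a supply-chain-style query through the official Neo4j Java driver. The graph model (:Company nodes, :SUPPLIES relationships, a risk_score property), the connection details, and the credentials are all illustrative assumptions, not details from the posting.

```scala
// Hypothetical sketch: querying an economic-relationship graph via the
// official Neo4j Java driver from Scala. Labels, properties, URI, and
// credentials are placeholders.
import org.neo4j.driver.{AuthTokens, GraphDatabase, Values}
import scala.jdk.CollectionConverters._
import scala.util.Using

object SupplyChainQuery {
  def main(args: Array[String]): Unit = {
    val driver = GraphDatabase.driver("bolt://localhost:7687",
      AuthTokens.basic("neo4j", "password")) // placeholder credentials
    Using.resource(driver) { d =>
      Using.resource(d.session()) { session =>
        // Cypher: suppliers within two hops of a company, ranked by risk.
        val result = session.run(
          """MATCH (c:Company {name: $name})<-[:SUPPLIES*1..2]-(s:Company)
            |RETURN DISTINCT s.name AS supplier, s.risk_score AS risk
            |ORDER BY risk DESC LIMIT 10""".stripMargin,
          Values.parameters("name", "Acme Corp"))
        result.asScala.foreach { rec =>
          println(s"${rec.get("supplier").asString}  risk=${rec.get("risk").asDouble}")
        }
      }
    }
  }
}
```

The variable-length pattern `[:SUPPLIES*1..2]` is the kind of traversal that is awkward in SQL but one line in Cypher, which is the usual argument for modeling supply chains as a graph.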

Posted 2 weeks ago

Apply

4.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

About the Company: VLink is hiring Java Developers for Noida, Gurugram, Bangalore, and Hyderabad.

About the Role:
Experience: 4 to 7 years
Notice Period: Immediate joiners only
Work Mode: Work from office

Responsibilities:
- Experience in building Order and Execution Management or Trading systems is required, along with financial experience and exposure to trading.
- An in-depth understanding of concurrent programming and experience designing high-throughput, high-availability, fault-tolerant distributed applications is required.
- Experience building distributed applications using NoSQL technologies like Cassandra, coordination services like ZooKeeper, and caching technologies like Apache Ignite and Redis is strongly preferred.
- Experience building microservices architecture / SOA is required.
- Experience with message-oriented streaming middleware architecture is preferred (Kafka, MQ, NATS, AMPS).
- Experience with orchestration, containerization, and building cloud-native applications (AWS, Azure) is a plus.
- Experience with modern web technology such as Angular, React, or TypeScript is a plus.
- Strong analytical and software architecture design skills, with an emphasis on test-driven development.
- Experience in programming languages such as Scala or Python would be a plus.
- Experience using project management methodologies such as Agile/Scrum.
- Effective written and verbal communication and presentation skills are required.
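As a concrete illustration of the message-oriented streaming middleware this role calls for, here is a minimal sketch of a Kafka order-event producer written in Scala against the standard Java client. The broker address, topic name, and payload are placeholders, not details from the posting.

```scala
// Minimal Kafka producer sketch for an order-event stream. Broker, topic,
// and the JSON payload are hypothetical.
import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
import org.apache.kafka.common.serialization.StringSerializer

object OrderEventPublisher {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    props.put("bootstrap.servers", "localhost:9092")
    props.put("key.serializer", classOf[StringSerializer].getName)
    props.put("value.serializer", classOf[StringSerializer].getName)
    props.put("acks", "all")                 // wait for full ISR ack: durability over latency
    props.put("enable.idempotence", "true")  // avoid duplicate records on retry

    val producer = new KafkaProducer[String, String](props)
    try {
      val event = """{"orderId":"42","side":"BUY","qty":100,"symbol":"XYZ"}"""
      // Keying by order id routes all events for one order to one partition,
      // which is how Kafka preserves per-order ordering.
      producer.send(new ProducerRecord("orders", "42", event)).get()
    } finally producer.close()
  }
}
```

The `acks=all` plus idempotence settings trade a little latency for the delivery guarantees a trading system typically needs; a fire-and-forget configuration would be faster but can silently drop or duplicate events.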

Posted 2 weeks ago

Apply

5.0 years

15 - 25 Lacs

Greater Kolkata Area

On-site

Exp: 5 - 12 Yrs | Work Mode: Hybrid | Location: Bangalore, Chennai, Kolkata, Pune, and Gurgaon
Primary Skills: Snowflake, SQL, DWH, Power BI, ETL, and Informatica.

We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.

Key Responsibilities:
- Minimum of 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies.
- Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT (Extract, Load, Transform) processes across various data sources.
- SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
- Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type 2), using DBT. Design and optimize high-performance data architectures.
- Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
- Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards.
- Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions, and create and maintain clear documentation for data processes, data models, and pipelines.

Skills & Qualifications:
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases and writing complex queries.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Experience using Power BI for data visualization and business intelligence reporting.
- Experience with Fivetran for automated ELT pipelines.
- Familiarity with Sigma Computing, Tableau, Oracle, and DBT.
- Strong data analysis, requirement gathering, and mapping skills.
- Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF), AWS, or GCP.
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing (other languages such as Java or Scala are a plus).

Skills: DWH, GCP, AWS, Snowflake, Airflow, Snowpipe, data analysis, SQL, data architecture, Tableau, performance tuning, pipelines, Oracle, ETL, data modeling, Azure, Python, DBT, Azkaban, Power BI, Fivetran, Sigma Computing, data warehousing, Luigi, Informatica

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

NVIDIA is continuously reinventing itself: the invention of the GPU sparked the growth of the PC gaming market, redefined modern computer graphics, and revolutionized parallel computing. Today, research in artificial intelligence is thriving globally, demanding highly scalable and massively parallel computation horsepower where NVIDIA GPUs excel. As a learning machine, NVIDIA constantly evolves by embracing opportunities that are challenging, unique, and impactful to the world. Our mission is to amplify human creativity and intelligence. Join our diverse team and discover how you can make a lasting impact on the world!

We are seeking a Senior Software Engineer to help enhance our HPC infrastructure. You will collaborate with a team of passionate engineers dedicated to developing and managing sophisticated infrastructure for business-critical services and AI applications. The ideal candidate will possess expertise in software development, designing reliable distributed systems, and implementing long-term maintenance strategies.

**Responsibilities:**
- Design highly available and scalable systems for our HPC clusters
- Explore new technologies to adapt to the evolving landscape
- Enhance infrastructure provisioning and management through automation
- Support a globally distributed, multi-cloud hybrid environment (AWS, GCP, and on-prem)
- Foster cross-functional relationships and partnerships across business units
- Ensure operational excellence, high uptime, and Quality of Service (QoS) for users
- Participate in the team's on-call rotation and respond to service incidents

**Requirements:**
- 5+ years of experience in designing and delivering large engineering projects
- Proficiency in at least two of the following programming languages: Golang, Java, C/C++, Scala, Python, Elixir
- Understanding of scalability challenges and server-side code performance
- Experience with the full software development lifecycle and cloud platforms (GCP, AWS, or Azure)
- Familiarity with modern CI/CD techniques, GitOps, and Infrastructure as Code (IaC)
- Strong problem-solving skills, work ethic, and attention to detail
- Bachelor's degree in Computer Science or a related technical field (or equivalent experience)
- Excellent communication and collaboration skills

**Preferred Qualifications:**
- Previous experience developing solutions for HPC clusters using Slurm or Kubernetes
- Strong knowledge of the Linux operating system and TCP/IP fundamentals

Join us in our mission to innovate and elevate the capabilities of our HPC infrastructure. Be part of a dynamic team committed to pushing boundaries and achieving operational excellence. Make your mark at NVIDIA and contribute to shaping the future of technology. (JR1983750)

Posted 2 weeks ago

Apply

5.0 years

15 - 25 Lacs

Pune, Maharashtra, India

On-site

Exp: 5 - 12 Yrs | Work Mode: Hybrid | Location: Bangalore, Chennai, Kolkata, Pune, and Gurgaon
Primary Skills: Snowflake, SQL, DWH, Power BI, ETL, and Informatica.

We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.

Key Responsibilities:
- Minimum of 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies.
- Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT (Extract, Load, Transform) processes across various data sources.
- SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
- Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type 2), using DBT. Design and optimize high-performance data architectures.
- Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
- Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards.
- Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions, and create and maintain clear documentation for data processes, data models, and pipelines.

Skills & Qualifications:
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases and writing complex queries.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Experience using Power BI for data visualization and business intelligence reporting.
- Experience with Fivetran for automated ELT pipelines.
- Familiarity with Sigma Computing, Tableau, Oracle, and DBT.
- Strong data analysis, requirement gathering, and mapping skills.
- Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF), AWS, or GCP.
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing (other languages such as Java or Scala are a plus).

Skills: DWH, GCP, AWS, Snowflake, Airflow, Snowpipe, data analysis, SQL, data architecture, Tableau, performance tuning, pipelines, Oracle, ETL, data modeling, Azure, Python, DBT, Azkaban, Power BI, Fivetran, Sigma Computing, data warehousing, Luigi, Informatica

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Gurugram, Haryana

On-site

Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Gurugram, Haryana, India; Bengaluru, Karnataka, India; Hyderabad, Telangana, India.

Minimum qualifications:
- Bachelor's degree in Computer Science, Mathematics, a related technical field, or equivalent practical experience.
- Experience building Machine Learning or Data Science solutions.
- Experience writing software in Python, Scala, R, or similar.
- Experience with data structures, algorithms, and software design.
- Ability to travel up to 30% of the time.

Preferred qualifications:
- Experience working with recommendation engines, data pipelines, or distributed machine learning, as well as data analytics, data visualization techniques and software, and deep learning frameworks.
- Experience in software development, professional services, solution engineering, technical consulting, and architecting and rolling out new technology and solution initiatives.
- Experience with core Data Science techniques.
- Knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments.
- Knowledge of cloud computing, including virtualization, hosted services, multi-tenant cloud infrastructures, storage systems, and content delivery networks.
- Excellent customer-facing communication and listening skills.

About the job: The Google Cloud Platform team helps customers transform and build what's next for their business — all with technology built in the cloud. Our products are developed for security, reliability, and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers — developers, small and large businesses, educational institutions, and government agencies — see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape how businesses of all sizes use technology to connect with customers, employees, and partners.

As a Cloud Engineer, you will play a key role in ensuring that customers have the best experience moving to the Google Cloud machine learning (ML) suite of products. You will design and implement machine learning solutions for customer use cases, leveraging core Google products. You will work with customers to identify opportunities to transform their business with machine learning, and will travel to customer sites to deploy solutions and deliver workshops designed to educate and empower customers to realize the full potential of Google Cloud. You will have access to Google's technology to monitor application performance, debug and troubleshoot product code, and address customer and partner needs. In this role, you will lead the timely adoption of Google Cloud Platform solutions to meet the customer's requirements.

Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities:
- Deliver effective big data and machine learning solutions and solve technical customer issues.
- Act as a technical advisor to Google's customers.
- Identify new product features and feature gaps, provide guidance on existing product issues, and collaborate with Product Managers and Engineers to influence the roadmap of Google Cloud Platform.
- Deliver best-practice recommendations, tutorials, blog articles, and technical presentations, adapting to different levels of key business and technical stakeholders.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Gurugram, Haryana, India; Bengaluru, Karnataka, India; Hyderabad, Telangana, India.

Minimum qualifications:
- Bachelor's degree in Computer Science, Mathematics, a related technical field, or equivalent practical experience.
- Experience building Machine Learning or Data Science solutions.
- Experience writing software in Python, Scala, R, or similar.
- Experience with data structures, algorithms, and software design.
- Ability to travel up to 30% of the time.

Preferred qualifications:
- Experience working with recommendation engines, data pipelines, or distributed machine learning, as well as data analytics, data visualization techniques and software, and deep learning frameworks.
- Experience in software development, professional services, solution engineering, technical consulting, and architecting and rolling out new technology and solution initiatives.
- Experience with core Data Science techniques.
- Knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments.
- Knowledge of cloud computing, including virtualization, hosted services, multi-tenant cloud infrastructures, storage systems, and content delivery networks.
- Excellent customer-facing communication and listening skills.

About the job: The Google Cloud Platform team helps customers transform and build what's next for their business — all with technology built in the cloud. Our products are developed for security, reliability, and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers — developers, small and large businesses, educational institutions, and government agencies — see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape how businesses of all sizes use technology to connect with customers, employees, and partners.

As a Cloud Engineer, you will play a key role in ensuring that customers have the best experience moving to the Google Cloud machine learning (ML) suite of products. You will design and implement machine learning solutions for customer use cases, leveraging core Google products. You will work with customers to identify opportunities to transform their business with machine learning, and will travel to customer sites to deploy solutions and deliver workshops designed to educate and empower customers to realize the full potential of Google Cloud. You will have access to Google's technology to monitor application performance, debug and troubleshoot product code, and address customer and partner needs. In this role, you will lead the timely adoption of Google Cloud Platform solutions to meet the customer's requirements.

Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities:
- Deliver effective big data and machine learning solutions and solve technical customer issues.
- Act as a technical advisor to Google's customers.
- Identify new product features and feature gaps, provide guidance on existing product issues, and collaborate with Product Managers and Engineers to influence the roadmap of Google Cloud Platform.
- Deliver best-practice recommendations, tutorials, blog articles, and technical presentations, adapting to different levels of key business and technical stakeholders.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana

On-site

Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Gurugram, Haryana, India; Bengaluru, Karnataka, India; Hyderabad, Telangana, India.

Minimum qualifications:
- Bachelor's degree in Computer Science, Mathematics, a related technical field, or equivalent practical experience.
- Experience building Machine Learning or Data Science solutions.
- Experience writing software in Python, Scala, R, or similar.
- Experience with data structures, algorithms, and software design.
- Ability to travel up to 30% of the time.

Preferred qualifications:
- Experience working with recommendation engines, data pipelines, or distributed machine learning, as well as data analytics, data visualization techniques and software, and deep learning frameworks.
- Experience in software development, professional services, solution engineering, technical consulting, and architecting and rolling out new technology and solution initiatives.
- Experience with core Data Science techniques.
- Knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments.
- Knowledge of cloud computing, including virtualization, hosted services, multi-tenant cloud infrastructures, storage systems, and content delivery networks.
- Excellent customer-facing communication and listening skills.

About the job: The Google Cloud Platform team helps customers transform and build what's next for their business — all with technology built in the cloud. Our products are developed for security, reliability, and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers — developers, small and large businesses, educational institutions, and government agencies — see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape how businesses of all sizes use technology to connect with customers, employees, and partners.

As a Cloud Engineer, you will play a key role in ensuring that customers have the best experience moving to the Google Cloud machine learning (ML) suite of products. You will design and implement machine learning solutions for customer use cases, leveraging core Google products. You will work with customers to identify opportunities to transform their business with machine learning, and will travel to customer sites to deploy solutions and deliver workshops designed to educate and empower customers to realize the full potential of Google Cloud. You will have access to Google's technology to monitor application performance, debug and troubleshoot product code, and address customer and partner needs. In this role, you will lead the timely adoption of Google Cloud Platform solutions to meet the customer's requirements.

Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities:
- Deliver effective big data and machine learning solutions and solve technical customer issues.
- Act as a technical advisor to Google's customers.
- Identify new product features and feature gaps, provide guidance on existing product issues, and collaborate with Product Managers and Engineers to influence the roadmap of Google Cloud Platform.
- Deliver best-practice recommendations, tutorials, blog articles, and technical presentations, adapting to different levels of key business and technical stakeholders.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Bengaluru, Karnataka | Job ID 30181671 | Job Category: Digital Technology
Job Title: Sr. Architect (Data and Integration)
Preferred Location: Bangalore, India
Full Time/Part Time: Full Time

Build a career with confidence. Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.

Role Responsibilities:

Enterprise Data & Integration Strategy
- Define and drive the enterprise data and integration vision, ensuring alignment with business and IT objectives.
- Establish best practices for API-led connectivity, data pipelines, and cloud-native architectures.
- Lead the implementation of standardized integration patterns, including data lakes, event-driven processing, and distributed computing.
- Ensure all data solutions are resilient, secure, and compliant with industry regulations.

Technical Leadership & Execution
- Architect and oversee the deployment of scalable, high-performance data platforms and integration solutions.
- Partner with engineering, analytics, and IT teams to design and implement data-driven capabilities.
- Optimize data processing workflows for security, performance, and cost-efficiency.
- Assess and integrate emerging technologies, including AI/ML and advanced analytics frameworks, into the data strategy.

Governance, Security & Compliance
- Establish enterprise-wide data governance frameworks to ensure data accuracy, security, and compliance.
- Implement advanced monitoring, logging, and alerting strategies to maintain high availability of data services.
- Drive automation in data quality management, security enforcement, and integration testing using DevOps methodologies.
- Work closely with risk and compliance teams to ensure adherence to data privacy regulations (GDPR, CCPA, etc.).

Role Purpose: The Data & Integration Architect (technical leadership) will be responsible for shaping the enterprise-wide data and integration strategy, driving innovation, and overseeing the implementation of large-scale data solutions. This role requires deep technical expertise in data engineering, API integrations, and real-time data processing to enable seamless interoperability across enterprise applications. The successful candidate will provide strategic direction, mentor technical teams, and work closely with business leaders to implement data frameworks that support analytics, automation, and digital transformation at scale.

Minimum Requirements:
- 12+ years of experience in enterprise data architecture, integration, or software engineering leadership roles.
- Proven expertise in designing and managing complex data architectures, including data lakes, data warehouses, and real-time streaming platforms.
- Hands-on experience with enterprise integration tools such as Boomi, MuleSoft, Kafka, AWS Glue, or equivalent.
- Deep understanding of API management, authentication mechanisms (OAuth2, SAML), and data security best practices.
- Strong experience integrating large-scale ERP, CRM, and HR systems (SAP, Salesforce, Workday, etc.).
- Proficiency in DevOps, CI/CD pipelines, and cloud infrastructure (AWS, Azure, GCP).
- Experience with AI/ML-driven data solutions and predictive analytics.
- Hands-on expertise with big data technologies such as Spark, Flink, or Databricks.
- Strong programming skills in Python, Java, or Scala for data transformation and automation.
- Industry certifications in cloud computing, data management, or integration technologies.

Benefits: We are committed to offering competitive benefits programs for all of our employees and enhancing our programs when necessary.
- Have peace of mind and body with our health insurance
- Make yourself a priority with flexible schedules and leave policy
- Drive your career forward through professional development opportunities
- Achieve your personal goals with our Employee Assistance Program

Our commitment to you: Our greatest assets are the expertise, creativity, and passion of our employees. We strive to provide a great place to work that attracts, develops, and retains the best talent, promotes employee engagement, fosters teamwork, and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine for growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback, and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Now!

Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age, or any other federally protected class.
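Since the role leans on API management and OAuth2, a minimal sketch of the client-credentials flow may help. This Scala version uses only the JDK's built-in HttpClient; the token endpoint, client id and secret, and scope are placeholder assumptions, not details from the posting.

```scala
// Hedged sketch of an OAuth2 client-credentials token request from Scala,
// using the JDK 11+ HttpClient. All endpoint and credential values are
// placeholders.
import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

object ClientCredentialsToken {
  def fetchTokenResponse(): String = {
    val form = "grant_type=client_credentials" +
      "&client_id=my-integration" +  // placeholder
      "&client_secret=CHANGE_ME" +   // placeholder: load from a secret store, never hard-code
      "&scope=erp.read"
    val request = HttpRequest.newBuilder(URI.create("https://auth.example.com/oauth2/token"))
      .header("Content-Type", "application/x-www-form-urlencoded")
      .POST(HttpRequest.BodyPublishers.ofString(form))
      .build()
    val response = HttpClient.newHttpClient()
      .send(request, HttpResponse.BodyHandlers.ofString())
    // The body is JSON like {"access_token":"...","expires_in":3600,...};
    // a real integration would parse it and cache the token until expiry.
    response.body()
  }
}
```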

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

The Applications Development Programmer Analyst position is an intermediate-level role in which you will participate in the establishment and implementation of new or revised application systems and programs, in coordination with the Technology team. Your main objective will be to contribute to applications systems analysis and programming activities.

Your responsibilities will include utilizing your knowledge of applications development procedures and concepts to identify and define necessary system enhancements. You will be expected to identify and analyze issues, make recommendations, and implement solutions based on your understanding of business processes, system processes, and industry standards to solve complex issues. Your role will also involve analyzing information to recommend solutions and improvements, conducting testing and debugging, writing basic code to design specifications, and assessing risk when making business decisions.

To excel in this role, you should have hands-on experience designing, developing, and optimizing scalable distributed data processing pipelines using Apache Spark and Scala, along with 3 to 6+ years of experience in big data development focused on Apache Spark, Scala/Python, and distributed systems. You should also bring advanced knowledge of Scala; a good understanding of Python for data engineering tasks; a solid understanding of data modeling principles and ETL processes in big data environments; strong analytical and problem-solving skills for diagnosing performance issues in Spark jobs and distributed systems; familiarity with Git, Jenkins, and other CI/CD tools for automating the deployment of big data applications; and experience with streaming platforms such as Apache Kafka or Spark Streaming.

You should hold a Bachelor's degree/University degree or equivalent experience to be considered for this position. This job description provides an overview of the work performed; other job-related duties may be assigned as required.
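The performance work this role describes often comes down to controlling shuffles. Below is a small, illustrative Spark/Scala job showing three routine techniques: pruning and projecting early, broadcasting a small dimension table so the large side of a join is not shuffled, and inspecting the physical plan. The paths and column names are hypothetical.

```scala
// Illustrative Spark tuning sketch: filter and project before joining,
// broadcast the small dimension, check the plan. Paths and columns are
// placeholders.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object TradeEnrichmentJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("trade-enrichment").getOrCreate()

    val trades = spark.read.parquet("/data/trades")       // large fact table
      .filter(col("trade_date") === "2025-07-01")         // prune before joining
      .select("trade_id", "account_id", "notional")       // project only needed columns

    val accounts = spark.read.parquet("/data/accounts")   // small dimension table

    // broadcast() hints a map-side join, avoiding a full shuffle of `trades`.
    val enriched = trades.join(broadcast(accounts), Seq("account_id"))

    enriched.explain()                                    // inspect the physical plan
    enriched.write.mode("overwrite").parquet("/data/enriched_trades")
    spark.stop()
  }
}
```

Reading `explain()` output to confirm a BroadcastHashJoin replaced a SortMergeJoin is exactly the kind of root-cause analysis of slow Spark jobs the posting asks about.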

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Haryana

On-site

As a Specialist in the S&C GN AI - Insurance AI Consulting team at Accenture, you will play a crucial role in driving strategic initiatives, managing business transformations, and leveraging industry expertise to deliver value-driven solutions. Your responsibilities will include providing strategic advisory services, conducting market research, and developing data-driven recommendations to enhance business performance.

In this role, you will work with the team to architect, design, build, deploy, deliver, and monitor advanced analytics models, including Generative AI, for a variety of client problems. You will also develop the functional aspects of Generative AI pipelines and interface with clients to understand their engineering and business problems, translating them into analytics problems that deliver actionable insights and operational improvements.

To excel in this position, you need a minimum of 2-4 years of experience with data-driven techniques, including exploratory data analysis and data pre-processing, and a Bachelor's degree in Mathematics, Statistics, Economics, Computer Science, or a related field. An MBA is mandatory, along with a solid foundation in statistical modeling, machine learning algorithms, and GenAI. Proficiency in programming languages such as Python, PySpark, SQL, or Scala is essential, as are strong communication and presentation skills.

Accenture Global Network offers a dynamic and collaborative environment where you will have the opportunity to work on innovative projects, grow your career, and gain leadership exposure. Join us in pushing the boundaries of business capabilities and delivering impactful solutions to clients. If you are passionate about leveraging Generative AI, design thinking, business process optimization, and stakeholder management to drive business success, we invite you to explore this exciting opportunity with Accenture.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Gurugram, Haryana, India; Bengaluru, Karnataka, India; Hyderabad, Telangana, India.

Minimum qualifications:
- Bachelor's degree in Computer Science, Mathematics, a related technical field, or equivalent practical experience.
- Experience building Machine Learning or Data Science solutions.
- Experience writing software in Python, Scala, R, or similar.
- Experience with data structures, algorithms, and software design.
- Ability to travel up to 30% of the time.

Preferred qualifications:
- Experience working with recommendation engines, data pipelines, or distributed machine learning, as well as data analytics, data visualization techniques and software, and deep learning frameworks.
- Experience in software development, professional services, solution engineering, technical consulting, and architecting and rolling out new technology and solution initiatives.
- Experience with core Data Science techniques.
- Knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments.
- Knowledge of cloud computing, including virtualization, hosted services, multi-tenant cloud infrastructures, storage systems, and content delivery networks.
- Excellent customer-facing communication and listening skills.

About the job: The Google Cloud Platform team helps customers transform and build what's next for their business — all with technology built in the cloud. Our products are developed for security, reliability, and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers — developers, small and large businesses, educational institutions, and government agencies — see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape how businesses of all sizes use technology to connect with customers, employees, and partners.

As a Cloud Engineer, you will play a key role in ensuring that customers have the best experience moving to the Google Cloud machine learning (ML) suite of products. You will design and implement machine learning solutions for customer use cases, leveraging core Google products. You will work with customers to identify opportunities to transform their business with machine learning, and will travel to customer sites to deploy solutions and deliver workshops designed to educate and empower customers to realize the full potential of Google Cloud. You will have access to Google's technology to monitor application performance, debug and troubleshoot product code, and address customer and partner needs. In this role, you will lead the timely adoption of Google Cloud Platform solutions to meet the customer's requirements.

Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities:
- Deliver effective big data and machine learning solutions and solve technical customer issues.
- Act as a technical advisor to Google's customers.
- Identify new product features and feature gaps, provide guidance on existing product issues, and collaborate with Product Managers and Engineers to influence the roadmap of Google Cloud Platform.
- Deliver best-practice recommendations, tutorials, blog articles, and technical presentations, adapting to different levels of key business and technical stakeholders.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Gurugram, Haryana, India; Bengaluru, Karnataka, India; Hyderabad, Telangana, India.

Minimum qualifications:
- Bachelor's degree in Computer Science, Mathematics, a related technical field, or equivalent practical experience.
- Experience building Machine Learning or Data Science solutions.
- Experience writing software in Python, Scala, R, or similar.
- Experience with data structures, algorithms, and software design.
- Ability to travel up to 30% of the time.

Preferred qualifications:
- Experience working with recommendation engines, data pipelines, or distributed machine learning, as well as data analytics, data visualization techniques and software, and deep learning frameworks.
- Experience in software development, professional services, solution engineering, technical consulting, and architecting and rolling out new technology and solution initiatives.
- Experience with core Data Science techniques.
- Knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments.
- Knowledge of cloud computing, including virtualization, hosted services, multi-tenant cloud infrastructures, storage systems, and content delivery networks.
- Excellent customer-facing communication and listening skills.

About the job: The Google Cloud Platform team helps customers transform and build what's next for their business — all with technology built in the cloud. Our products are developed for security, reliability, and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers — developers, small and large businesses, educational institutions, and government agencies — see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape how businesses of all sizes use technology to connect with customers, employees, and partners.

As a Cloud Engineer, you will play a key role in ensuring that customers have the best experience moving to the Google Cloud machine learning (ML) suite of products. You will design and implement machine learning solutions for customer use cases, leveraging core Google products. You will work with customers to identify opportunities to transform their business with machine learning, and will travel to customer sites to deploy solutions and deliver workshops designed to educate and empower customers to realize the full potential of Google Cloud. You will have access to Google's technology to monitor application performance, debug and troubleshoot product code, and address customer and partner needs. In this role, you will lead the timely adoption of Google Cloud Platform solutions to meet the customer's requirements.

Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities:
- Deliver effective big data and machine learning solutions and solve technical customer issues.
- Act as a technical advisor to Google's customers.
- Identify new product features and feature gaps, provide guidance on existing product issues, and collaborate with Product Managers and Engineers to influence the roadmap of Google Cloud Platform.
- Deliver best-practice recommendations, tutorials, blog articles, and technical presentations, adapting to different levels of key business and technical stakeholders.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Delhi

On-site

Are you a skilled professional with experience in SQL, Python (Pandas & SQLAlchemy), and data engineering? We have an exciting opportunity for an ETL Developer to join our team!

As an ETL Developer, you will work with MS SQL, Python, and various databases to extract, transform, and load data in support of insights and business goals. You should have a Bachelor's degree in Computer Science or a related field, or equivalent work experience. Additionally, you should have at least 5 years of experience working with MS SQL, 3 years of experience with Python (Pandas, SQLAlchemy), and 3 years of experience supporting on-call challenges.

Key responsibilities include running SQL queries on multiple disparate databases, working with large datasets using Python and Pandas, tuning MS SQL queries, debugging data using Python and SQLAlchemy, collaborating in an agile environment, managing source control with GitLab and GitHub, creating and maintaining databases, and interpreting complex data for insights. Familiarity with Azure, ADF, Spark, and Scala concepts is also expected.

If you're passionate about data, possess a strong problem-solving mindset, and thrive in a collaborative environment, we encourage you to apply for this position. For more information or to apply, please send your resume to samdarshi.singh@mwidm.com or contact us at +91 62392 61536. Join us in this exciting opportunity to contribute to our data engineering team!
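This posting is Pandas/SQLAlchemy-centric, but it also asks for familiarity with Spark and Scala concepts, so here is a hedged sketch of the analogous extract step done with Spark's JDBC reader in Scala. The connection string, credentials, and table are placeholders, and a SQL Server JDBC driver is assumed to be on the classpath.

```scala
// Hypothetical extract step: reading from MS SQL via Spark's JDBC source,
// the Scala analog of pandas.read_sql over a SQLAlchemy engine. All
// connection details and names are placeholders.
import org.apache.spark.sql.SparkSession

object MsSqlExtract {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("mssql-extract").getOrCreate()

    val orders = spark.read.format("jdbc")
      .option("url", "jdbc:sqlserver://dbhost:1433;databaseName=sales") // placeholder
      // A subquery pushed to the server keeps the transfer small, like a
      // targeted read_sql query instead of pulling a whole table.
      .option("dbtable", "(SELECT order_id, amount, created_at FROM dbo.orders) q")
      .option("user", "etl_user")                                       // placeholder
      .option("password", sys.env.getOrElse("DB_PASSWORD", ""))         // from env, not hard-coded
      .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
      .load()

    // Quick sanity checks before loading downstream, akin to df.info() in Pandas.
    orders.printSchema()
    println(s"row count: ${orders.count()}")
    spark.stop()
  }
}
```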

Posted 2 weeks ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Gurugram, Haryana, India; Bengaluru, Karnataka, India; Hyderabad, Telangana, India.

Minimum qualifications:
- Bachelor's degree in Computer Science, Mathematics, a related technical field, or equivalent practical experience.
- Experience building Machine Learning or Data Science solutions.
- Experience writing software in Python, Scala, R, or similar.
- Experience with data structures, algorithms, and software design.
- Ability to travel up to 30% of the time.

Preferred qualifications:
- Experience working with recommendation engines, data pipelines, distributed machine learning, data analytics, data visualization techniques and software, and deep learning frameworks.
- Experience in software development, professional services, solution engineering, technical consulting, and architecting and rolling out new technology and solution initiatives.
- Experience with core Data Science techniques.
- Knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments.
- Knowledge of cloud computing, including virtualization, hosted services, multi-tenant cloud infrastructures, storage systems, and content delivery networks.
- Excellent customer-facing communication and listening skills.

About The Job

The Google Cloud Platform team helps customers transform and build what's next for their business — all with technology built in the cloud. Our products are developed for security, reliability, and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers — developers, small and large businesses, educational institutions, and government agencies — see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape how businesses of all sizes use technology to connect with customers, employees, and partners.

As a Cloud Engineer, you will play a key role in ensuring that customers have the best experience moving to the Google Cloud machine learning (ML) suite of products. You will design and implement machine learning solutions for customer use cases, leveraging core Google products. You will work with customers to identify opportunities to transform their business with machine learning, and will travel to customer sites to deploy solutions and deliver workshops designed to educate and empower customers to realize the full potential of Google Cloud. You will have access to Google's technology to monitor application performance, debug and troubleshoot product code, and address customer and partner needs. In this role, you will lead the timely execution of adapting Google Cloud Platform solutions to the customer's requirements.

Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities
- Deliver effective big data and machine learning solutions and solve technical customer issues.
- Act as a technical advisor to Google's customers.
- Identify new product features and feature gaps, provide guidance on existing product issues, and collaborate with Product Managers and Engineers to influence the roadmap of Google Cloud Platform.
- Deliver best practice recommendations, tutorials, blog articles, and technical presentations, adapting to different levels of key business and technical stakeholders.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
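The preferred qualifications above mention recommendation engines and distributed machine learning. As a rough illustration of that kind of work (a sketch, not anything specified by the posting), collaborative filtering with Spark MLlib's ALS in Scala might look like the following; the input path and column names are assumptions made for the example:

```scala
import org.apache.spark.ml.recommendation.ALS
import org.apache.spark.sql.SparkSession

object RecommenderSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("als-recommender-sketch")
      .getOrCreate()

    // Hypothetical ratings data with columns (userId, itemId, rating).
    val ratings = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("gs://example-bucket/ratings.csv") // illustrative GCS path

    // Alternating Least Squares: a standard distributed collaborative-filtering model.
    val als = new ALS()
      .setUserCol("userId")
      .setItemCol("itemId")
      .setRatingCol("rating")
      .setRank(10)
      .setMaxIter(10)

    val model = als.fit(ratings)

    // Top-5 item recommendations per user.
    model.recommendForAllUsers(5).show(truncate = false)

    spark.stop()
  }
}
```

Because ALS trains on partitioned data, the same code scales from a laptop to a cluster, which is the usual appeal of distributed ML for this kind of customer engagement.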

Posted 2 weeks ago

Apply

9.0 - 20.0 years

0 Lacs

hyderabad, telangana

On-site

Salesforce is offering immediate opportunities for software developers who are passionate about creating impactful code that benefits users, the company, and the industry. Join a team of talented engineers to develop innovative features that customers will love, while ensuring the stability and scalability of our CRM platform. The software engineer role at Salesforce involves architecture, design, implementation, and testing to deliver high-quality products. Depending on your seniority level, you will have the chance to engage in code review, mentor junior engineers, and provide technical guidance to the team. We prioritize writing maintainable code that enhances product stability and efficiency. Our team values individual strengths and encourages personal growth, believing that autonomous teams lead to empowered individuals who drive success for the product, company, and customers.

Responsibilities for Principal, Lead, or Senior Engineers include:
- Developing new components in a rapidly evolving market to enhance scalability and efficiency
- Creating high-quality code for millions of application users
- Making design decisions based on performance and scalability considerations
- Contributing to all phases of the software development life cycle in a Hybrid Engineering model
- Building efficient components in a multi-tenant SaaS cloud environment
- Providing code review, mentorship, and technical guidance to junior team members

Required Skills:
- Proficiency in multiple programming languages and platforms
- 9 to 20 years of software development experience
- Domain knowledge in CCaaS/CPaaS/UCaaS
- Experience with WebRTC, SIP, and telephony-layer protocols
- Strong object-oriented programming and scripting language skills
- Proficiency in SQL and relational/non-relational databases
- Development experience with SaaS applications on public cloud infrastructure
- Knowledge of queues, locks, event-driven architecture, and workload distribution
- Understanding of software development best practices and leadership skills
- Degree or equivalent relevant experience required

Benefits & Perks:
- Comprehensive benefits package including well-being reimbursement, parental leave, adoption assistance, and more
- Access to on-demand training with Trailhead.com
- Opportunities for exposure to executive leadership and coaching
- Participation in volunteer activities and community giving initiatives

For more information on benefits and perks, please visit https://www.salesforcebenefits.com/

Posted 2 weeks ago

Apply

13.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Tech Specialist with strong analytical and technical ability, with over 13 years of experience in enterprise web applications, REST services, and workflow processing service development using Java/J2EE technologies. Experienced in working on medium to large enterprise projects, preferably in financial services.

Knowledge/Experience:
- Hands-on experience designing and developing scalable, distributed applications; architecting large-scale applications using Spark, Kafka, and big data technologies
- Knowledge of Hadoop architecture
- Knowledge of frameworks: Velocity, Spring, Spring Boot
- Knowledge of OOAD, UML, and design patterns
- Strong insight into OOP concepts and good hands-on experience with Java (version 1.8 or above) and other Java-based frameworks such as Spring Batch, Spring IoC, Spring Annotation, and Spring Security
- Hands-on experience with a messaging platform such as Kafka
- Good working knowledge of jBPM as a BPMN framework is a must
- Good working knowledge of Docker, Kubernetes, and OpenShift is a must
- Strong knowledge of Java design patterns, microservice design patterns, event streams, event/message-based architecture, domain-driven design, etc.
- Strong knowledge of API-based architecture and SOA
- Expertise in serverless, Tomcat (embedded/non-embedded), Jetty (embedded/non-embedded), WebSphere, Spring Batch, Spring IoC, Spring Annotation, and Spring Security
- Expertise in mocking, JUnit, and performance testing of solutions
- Basic Unix/Linux knowledge: able to write and understand basic shell scripts and basic Unix commands
- Good working knowledge of in-memory distributed caches (Hazelcast, GemFire) is good to have
- Experience working in an Agile/DevOps environment
- Knowledge of web server setup and configuration with reverse proxy/SSL setup (preferably nginx) is a plus

Good to have skills:
- Financial markets background is preferable but not a must
- Knowledge of testing concepts (TDD, BDD) is a plus
- Knowledge of ELK/AppDynamics
- Knowledge of other programming languages and frameworks such as Vaadin (UI framework), Kotlin, Scala, and shell scripting is good to have

Key Responsibilities:
- Act as a seasoned SME and technical specialist in the Client Onboarding/AML/KYC/Account Opening domain
- Translate business requirements into technical documents/code
- Employ appropriate design standards, frameworks, and patterns while designing and developing components
- Implement and maintain a suite of workflow-driven Java applications with RESTful services
- Develop high-quality code employing software engineering and testing best practices
- Develop software that processes, persists, and distributes data via relational and non-relational technologies
- Hands-on coding, authoring unit tests (JUnit) and performance tests, and maintaining high code quality
- React and provide quick turnaround to business requirements and management requests
- Be well versed in the Agile development life cycle and capable of leading a team of developers
- Partner with database developers to implement ingestion, orchestration, quality/reconciliation, and distribution services
- Work independently with good communication skills; experience working on complex, medium to large projects

Job Background:
- The position is based in India and is focused on delivery of the work, ensuring a robust design
- This role may report to the technology team lead based in Pune
- The candidate should be able to work independently and should be self-motivated
- The candidate might be required to work with vendors or third parties in joint delivery teams
- The role requires application of technical skills and knowledge of the business to develop solutions to meet business needs
- As part of large, geographically distributed teams, the candidate may have to manage stakeholders across multiple functional areas
- The position requires analytical skills to filter, prioritize, and validate potentially complex material, technical or otherwise, from multiple sources
- The candidate will work with complex and variable issues with substantial potential impact, weighing alternatives and balancing potentially conflicting needs

Qualification:
- Bachelor's degree (in science, computers, information technology, or engineering)
- Candidate should be willing to work late in the evening India time on a need basis in order to interact with US/other global teams

Job Family Group: Technology
Job Family: Applications Development
Time Type: Full time

Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
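The posting asks for hands-on experience with a messaging platform such as Kafka. A minimal Kafka producer in Scala, using the standard Java client, might look like the sketch below; the broker address, topic name, and payload are placeholders chosen for the example:

```scala
import java.util.Properties

import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
import org.apache.kafka.common.serialization.StringSerializer

object KafkaProducerSketch {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    props.put("bootstrap.servers", "localhost:9092") // assumed broker address
    props.put("key.serializer", classOf[StringSerializer].getName)
    props.put("value.serializer", classOf[StringSerializer].getName)
    props.put("acks", "all") // wait for full acknowledgement before considering the send done

    val producer = new KafkaProducer[String, String](props)
    try {
      // "onboarding-events" is a hypothetical topic name used only for illustration.
      val record = new ProducerRecord[String, String](
        "onboarding-events", "client-42", """{"status":"KYC_COMPLETE"}""")
      producer.send(record).get() // block until the broker confirms the write
    } finally {
      producer.close()
    }
  }
}
```

Setting acks=all trades a little latency for stronger durability, a choice that tends to matter in regulated domains like KYC/AML event processing.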

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

The AIML Architect-Dataflow, BigQuery position is a critical role within our organization, focusing on designing, implementing, and optimizing data architectures in Google Cloud's BigQuery environment. You will combine advanced data analytics with artificial intelligence and machine learning techniques to create efficient data models that enhance decision-making processes across various departments. Your responsibilities will include building data pipeline solutions that utilize BigQuery and Dataflow functionalities to ensure high performance, scalability, and resilience in our data workflows. Collaboration with data engineers, data scientists, and application developers is essential to align with business goals and technical vision. You must possess a deep understanding of cloud-native architectures and be enthusiastic about leveraging cutting-edge technologies to drive innovation, efficiency, and insights from extensive datasets. You should have a robust background in data processing and AI/ML methodologies, capable of translating complex technical requirements into scalable solutions that meet the evolving needs of the organization.

Key Responsibilities
- Design and architect data processing solutions using Google Cloud BigQuery and Dataflow.
- Develop data pipeline frameworks supporting batch and real-time analytics.
- Implement machine learning algorithms for extracting insights from large datasets.
- Optimize data storage and retrieval processes to improve performance.
- Collaborate with data scientists to build scalable models.
- Ensure data quality and integrity throughout the data lifecycle.
- Work closely with cross-functional teams to align data workflows with business objectives.
- Conduct technical evaluations and assessments of new tools and technologies.
- Manage large-scale data migrations to cloud environments.
- Document architecture designs and maintain technical specifications.
- Provide mentorship and guidance to junior data engineers and analysts.
- Stay updated on industry trends in cloud computing and data engineering.
- Design and implement security best practices for data access and storage.
- Monitor and troubleshoot data pipeline performance issues.
- Conduct training sessions on BigQuery best practices for team members.

Required Qualifications
- Bachelor's or Master's degree in Computer Science, Data Science, or related field.
- 5+ years of experience in data architecture and engineering.
- Proficiency in Google Cloud Platform, especially BigQuery and Dataflow.
- Strong understanding of data modeling and ETL processes.
- Experience in implementing machine learning solutions in cloud environments.
- Solid programming skills in Python, Java, or Scala.
- Expertise in SQL and other query optimization techniques.
- Experience with big data workloads and distributed computing.
- Familiarity with modern data processing frameworks and tools.
- Strong analytical and problem-solving skills.
- Excellent communication and team collaboration abilities.
- Proven track record of managing comprehensive projects from inception to completion.
- Ability to work in a fast-paced, agile environment.
- Understanding of data governance, compliance, and security.
- Experience with data visualization tools is a plus.
- Certifications in Google Cloud or relevant technologies are advantageous.

Skills: Cloud Computing, SQL Proficiency, Dataflow, AIML, Scala, Data Governance, ETL Processes, Python, Machine Learning, Java, Google Cloud Platform, Data Architecture, Data Modeling, BigQuery, Data Engineering, Data Visualization Tools
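Since the role centers on BigQuery-backed batch pipelines, here is a hedged sketch in Scala using the open-source spark-bigquery connector (rather than Dataflow itself) of a job that reads from and writes back to BigQuery; the project, dataset, table, and bucket names are placeholders:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object BigQueryBatchSketch {
  def main(args: Array[String]): Unit = {
    // Requires the spark-bigquery-with-dependencies connector on the classpath.
    val spark = SparkSession.builder()
      .appName("bigquery-batch-sketch")
      .getOrCreate()

    // Placeholder table: replace with a real project.dataset.table.
    val events = spark.read
      .format("bigquery")
      .option("table", "my-project.analytics.events")
      .load()

    // Simple aggregation of the kind that might feed downstream ML features.
    val daily = events
      .groupBy(col("event_date"), col("event_type"))
      .agg(count("*").as("event_count"))

    daily.write
      .format("bigquery")
      .option("table", "my-project.analytics.daily_event_counts")
      .option("temporaryGcsBucket", "my-temp-bucket") // the connector stages writes via GCS
      .save()

    spark.stop()
  }
}
```

A Dataflow-native pipeline would use Apache Beam instead; the connector-based Spark approach above is just one way to illustrate batch analytics against BigQuery.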

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

pune, maharashtra

On-site

A career with our organization is a journey, not a destination. This opportunity could be the next best step in your technical career. Join us. As a Lead Architect at JPMorgan Chase within the Consumer and Community Banking division, you will play a crucial role in a team dedicated to creating high-quality architecture solutions for various software applications utilizing modern cloud-based technologies. In this position, you will serve as a key technical contributor responsible for developing critical architecture solutions across multiple technical domains to support project objectives.

Responsibilities:
- Collaborate with technical teams and business stakeholders to propose technical solutions that address current and future requirements
- Define the technical target state of products and guide the realization of the strategic vision
- Engage in architecture governance bodies to ensure adherence to architectural standards
- Evaluate new technologies, provide feedback on recommendations, and drive innovation
- Develop innovative software solutions, handle design and development tasks, and troubleshoot technical issues with a focus on unconventional problem-solving approaches
- Write secure, high-quality production code, conduct code reviews, and troubleshoot code authored by colleagues
- Identify opportunities for automation and improvement to enhance the operational stability of software applications and systems
- Lead assessment sessions with external vendors, startups, and internal teams to evaluate architectural designs and technical viability for integration into existing systems

Required Qualifications, Capabilities, and Skills:
- Formal training or certification in software engineering concepts and a minimum of 5 years of practical experience
- Hands-on experience in system design, application development, testing, and operational stability
- Proficiency in at least one programming language
- Strong knowledge of automation and continuous delivery methods
- Comprehensive understanding of the Software Development Life Cycle and experience in enhancing engineering practices
- Expertise in .NET/.NET Core or similar enterprise-level technologies, with a successful track record in leading software engineering teams
- Exposure to JS stacks like Angular, React, Node, and TypeScript
- Familiarity with XML, JSON, NoSQL, and relational databases
- Experience in developing scalable data-driven applications and implementing Performance Engineering practices
- Proficiency in CI/CD practices, containerization technologies (e.g., Docker, Kubernetes), and deploying enterprise-grade applications in cloud platforms such as AWS, Azure, or GCP
- Minimum 4 years of Agile Methodology experience, with a commitment to promoting agile practices, high performance, teamwork, and sustainability within teams

Preferred Qualifications, Capabilities, and Skills:
- Keen interest in staying updated on industry trends and emerging technologies
- Familiarity with functional programming techniques and languages like Scala, Clojure, or Lisp (see the sketch below)
- Knowledge of Secure Software Development Lifecycle practices based on OWASP guidelines
- Proficiency in Analysis and Design Patterns, including Object-Oriented Analysis and Design (OOAD), UML, MVVM, and Microservices
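As a hedged illustration of the functional-programming style named in the preferred qualifications, the small Scala example below uses an algebraic data type, pattern matching, and a pure fold; the payment domain and all names are invented for the example, not taken from the posting:

```scala
// Algebraic data type: a closed set of immutable event shapes.
sealed trait PaymentEvent
final case class Authorized(amount: BigDecimal) extends PaymentEvent
final case class Captured(amount: BigDecimal) extends PaymentEvent
final case class Refunded(amount: BigDecimal) extends PaymentEvent

object Ledger {
  // Pure function: the net balance is a fold over an immutable event list,
  // with the compiler checking that every case of the sealed trait is handled.
  def netBalance(events: List[PaymentEvent]): BigDecimal =
    events.foldLeft(BigDecimal(0)) {
      case (acc, Authorized(_)) => acc       // an authorization holds funds, moves nothing
      case (acc, Captured(amt)) => acc + amt
      case (acc, Refunded(amt)) => acc - amt
    }

  def main(args: Array[String]): Unit = {
    val events = List(Authorized(100), Captured(100), Refunded(25))
    println(netBalance(events)) // prints 75
  }
}
```

The appeal for architecture work is that immutable data and exhaustive pattern matching push a class of state and null-handling bugs from runtime to compile time.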

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

Salesforce is currently seeking software developers who are passionate about creating impactful solutions for users, the company, and the industry. Join a team of talented engineers to design and develop innovative features that enhance our CRM platform's stability and scalability. As a software engineer at Salesforce, you will be involved in architecture, design, implementation, and testing to ensure the delivery of high-quality products to our customers. We take pride in writing maintainable code that strengthens product stability and simplifies our work processes. Our team values individual strengths and encourages personal growth. By empowering autonomous teams, we aim to foster a culture of innovation and excellence that benefits both our employees and customers.

**Your Impact**

As a Senior Backend Software Engineer at Salesforce, your responsibilities will include:
- Building new components to enhance our technology offerings in a dynamic market
- Developing high-quality code for our cloud platform used by millions of users
- Designing, implementing, and optimizing APIs and API framework features for scalability
- Contributing to all phases of the software development life cycle in a Hybrid Engineering model
- Creating efficient components for a multi-tenant SaaS cloud environment
- Conducting code reviews, mentoring junior engineers, and providing technical guidance

**Required Skills:**
- Proficiency in multiple programming languages and platforms
- 5+ years of experience in backend software development, including designing distributed systems
- Deep knowledge of object-oriented programming and scripting languages such as Java, Python, Scala, C#, Go, Node.js, and C++
- Strong skills in PostgreSQL/SQL and experience with relational and non-relational databases
- Understanding of software development best practices and leadership abilities
- Degree or equivalent experience with relevant competencies

**Preferred Skills:**
- Experience with developing SaaS products on public cloud platforms like AWS, Azure, or GCP
- Knowledge of Big Data/ML, S3, Kafka, Elasticsearch, Terraform, Kubernetes, and Docker
- Previous experience in a fast-paced, multinational organization

**Benefits & Perks**
- Comprehensive benefits package including well-being reimbursement, parental leave, adoption assistance, and more
- Access to training resources on Trailhead.com
- Mentorship opportunities with leadership and executive thought leaders
- Volunteer programs and community engagement initiatives as part of our giving back model

For further details, please visit the [Salesforce Benefits Page](https://www.salesforcebenefits.com/).

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

haryana

On-site

Genpact is a global professional services and solutions firm dedicated to delivering outcomes that shape the future. With a workforce of over 125,000 professionals spanning more than 30 countries, we are fueled by our innate curiosity, entrepreneurial agility, and commitment to creating lasting value for our clients. Our purpose, the relentless pursuit of a world that works better for people, drives us to serve and transform leading enterprises, including the Fortune Global 500, leveraging our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI. We are currently seeking applications for the position of Principal Consultant - Databricks Lead Developer. As a Databricks Developer in this role, you will be tasked with solving cutting-edge real-world problems to meet both functional and non-functional requirements.

Responsibilities:
- Keep abreast of new and emerging technologies and assess their potential application for service offerings and products.
- Collaborate with architects and lead engineers to devise solutions that meet functional and non-functional requirements.
- Demonstrate proficiency in understanding relevant industry trends and standards.
- Showcase strong analytical and technical problem-solving skills.
- Possess experience in the Data Engineering domain.

Minimum qualifications:
- Bachelor's Degree or equivalency in CS, CE, CIS, IS, MIS, or an engineering discipline, or equivalent work experience.
- <<>> years of experience in IT.
- Familiarity with new and emerging technologies and their possible applications for service offerings and products.
- Collaboration with architects and lead engineers to develop solutions meeting functional and non-functional requirements.
- Understanding of industry trends and standards.
- Strong analytical and technical problem-solving abilities.
- Proficiency in either Python or Scala, preferably Python.
- Experience in the Data Engineering domain.

Preferred qualifications:
- Knowledge of Unity Catalog and basic governance.
- Understanding of Databricks SQL Endpoint.
- Experience with CI/CD for building Databricks job pipelines.
- Exposure to migration projects for building unified data platforms.
- Familiarity with DBT, Docker, and Kubernetes.

If you are a proactive individual with a passion for innovation and a strong commitment to continuous learning and upskilling, we invite you to apply for this exciting opportunity to join our team at Genpact.
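Although the posting leans toward Python, the kind of Databricks job pipeline it describes can be sketched in Scala as a minimal bronze-to-silver Delta transformation; the mount paths, column names, and layer names below are assumptions made for the example:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object DatabricksJobSketch {
  def main(args: Array[String]): Unit = {
    // On Databricks a SparkSession is already provided; getOrCreate() reuses it.
    val spark = SparkSession.builder()
      .appName("bronze-to-silver-sketch")
      .getOrCreate()

    // Hypothetical bronze-layer Delta input path on DBFS.
    val raw = spark.read.format("delta").load("/mnt/bronze/orders")

    // Basic cleansing: drop malformed rows and normalize a column.
    val cleaned = raw
      .filter(col("order_id").isNotNull)
      .withColumn("country", upper(col("country")))

    // Write to a hypothetical silver-layer Delta location.
    cleaned.write
      .format("delta")
      .mode("overwrite")
      .save("/mnt/silver/orders")
  }
}
```

Packaged as a JAR and scheduled as a Databricks job, a transformation like this is the kind of unit a CI/CD pipeline for job deployment (mentioned in the preferred qualifications) would build and promote.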

Posted 2 weeks ago

Apply

0.0 - 4.0 years

0 Lacs

karnataka

On-site

Build the future of the AI Data Cloud by joining the Snowflake team. Snowflake is at the forefront of the data revolution, committed to creating the world's greatest data and applications platform. Our "get it done" culture ensures that everyone at Snowflake has an equal opportunity to innovate on new ideas, create work with a lasting impact, and excel in a collaborative environment.

Snowflake's pre-sales organization is actively seeking an Associate Sales Engineer to join the Sales Engineering training program called Snowmaker. The purpose of Snowmaker is to nurture aspiring technical talent through a blend of education and mentorship. This six-month program provides comprehensive technical and sales skills training through classroom sessions, shadowing, and mentoring by sales and pre-sales leaders and peers. As an Associate Sales Engineer, you will have the chance to familiarize yourself with Snowflake's technology portfolio, understand the needs and business challenges of customers from various industries, and grasp Snowflake's sales process to address them. You will apply your technical aptitude, exceptional communication skills, and creative problem-solving abilities on a daily basis. Upon successful completion of the program, you will join our regional Sales Engineering team and contribute to its success.

Upon the successful completion of the training, your responsibilities will include:
- Presenting Snowflake technology and vision to executives and technical contributors at prospects and customers
- Leveraging knowledge of a domain or industry to align Snowflake's value with the customers' business and technical problems
- Working hands-on with SEs, prospects, and customers to demonstrate and communicate the value of Snowflake technology throughout the sales cycle
- Maintaining a deep understanding of competitive and complementary technologies and vendors to position Snowflake effectively
- Collaborating with Product Management, Engineering, and Marketing to enhance Snowflake's products and marketing
- Providing post-sales technical guidance to the customers' technical team to drive customer utilization of Snowflake and digital transformation success
- Contributing to global and regional Sales Engineering initiatives

On day one, we expect you to have:
- A deep interest in translating customer needs and problems into technical solutions
- A passion for technology, a willingness to learn, and the ability to thrive in a fast-paced work environment
- Ability to present technical topics to various audiences via whiteboard sessions, presentations, and demos
- A university degree in Computer Science, Engineering, Mathematics, or related fields; equivalent experience is preferred
- Industry or internship experience focusing on data analytics, pre-sales, solution architecture, or data engineering
- Hands-on experience with SQL, Python, Scala, Spark, Java, cloud technology, data platforms, or data analytics (bonus)
- A strong desire to pursue a career in Sales Engineering

Snowflake is experiencing rapid growth, and we are expanding our team to support and accelerate our development. We are seeking individuals who share our values, challenge conventional thinking, and drive innovation while building a successful future for themselves and Snowflake. Join us and make an impact today!

For jobs in the United States, please refer to the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com
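The posting lists hands-on SQL/Python/Scala/Spark experience as a bonus. For flavor, a minimal Snowpark-for-Scala query of the kind a sales engineer might demo could look like the sketch below; the connection properties and table name are placeholders, not real account details:

```scala
import com.snowflake.snowpark.Session
import com.snowflake.snowpark.functions._

object SnowparkSketch {
  def main(args: Array[String]): Unit = {
    // Placeholder connection properties; fill in real account details before running.
    val configs = Map(
      "URL" -> "https://<account>.snowflakecomputing.com",
      "USER" -> "<user>",
      "PASSWORD" -> "<password>",
      "ROLE" -> "ANALYST",
      "WAREHOUSE" -> "DEMO_WH",
      "DB" -> "DEMO_DB",
      "SCHEMA" -> "PUBLIC"
    )
    val session = Session.builder.configs(configs).create

    // Hypothetical table: the filter and aggregation are pushed down to Snowflake.
    session.table("ORDERS")
      .filter(col("STATUS") === "SHIPPED")
      .groupBy(col("REGION"))
      .agg(sum(col("AMOUNT")).as("TOTAL_SHIPPED"))
      .show()

    session.close()
  }
}
```

Because Snowpark compiles the DataFrame operations into SQL executed inside Snowflake, the demo point is that no data leaves the warehouse for the computation.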

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies