
954 OLAP Jobs - Page 28

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

About The Role
We are seeking an experienced Senior Backend Developer with strong expertise in Java, the Spring framework, and high-availability service design. This role will be pivotal in designing, developing, and optimizing robust backend systems that power our index and product generation platforms while providing technical leadership within the team. You'll be joining a dynamic team focused on solving complex challenges in delivering near real-time financial data with high throughput and resiliency requirements.

About The Team
This is an excellent opportunity to join the Index IT team, part of a delivery-focused IT group responsible for designing, developing and supporting internal, client and public-facing distribution solutions. If selected, you will work as part of a delivery-focused and talented software development team responsible for designing, developing and supporting the index and product generation platforms. You will use cutting-edge software development techniques and technologies, following the best practices of the industry. Our team solves challenging problems around delivering near real-time financial data, working with large flexible schemas and building database systems that provide exceptional throughput and resiliency. We leverage the latest technologies, including Kubernetes and continuous integration/deployment pipelines, and build highly observable applications. MSCI provides a very attractive compensation package, an exciting work environment and opportunities for continuous self-development and career advancement for the right candidates.

Your Key Responsibilities
- Design, develop, and maintain scalable, high-performance backend applications using Java and the Spring framework
- Lead the architecture and implementation of complex API services that interact with high-availability database systems
- Develop solutions for processing and delivering near real-time financial data streams
- Design flexible schemas that can accommodate evolving financial data requirements
- Collaborate closely with product managers, business analysts, and other developers to translate business requirements into technical solutions
- Design and optimize OLAP database interactions for analytical performance and high availability
- Implement observable applications with comprehensive monitoring and logging
- Design and develop RESTful APIs following industry best practices
- Lead code reviews and mentor junior developers on team best practices
- Participate in the full software development lifecycle from requirements analysis through deployment
- Troubleshoot and resolve complex production issues in high-throughput systems
- Evaluate and recommend new technologies and approaches to improve system performance and developer productivity
- Contribute to technical documentation and system design specifications

Preferred Qualifications
- Master's degree in Computer Science, Software Engineering, or a related field
- Experience with Kubernetes and containerized application deployment
- Experience with observability frameworks such as OpenTelemetry (OTEL)
- Proficiency with continuous integration and deployment methodologies (CI/CD)
- Knowledge of cloud platforms (AWS, Azure, or GCP)
- Experience with microservices architecture
- Experience with containerization technologies (Docker)
- Understanding of DevOps practices
- Experience with message brokers (Kafka, RabbitMQ)
- Background in agile development methodologies
- Experience with test-driven development and automated testing frameworks
- Familiarity with financial data models and structures
- Background in financial services or experience with financial data

Required Qualifications (your skills and experience that will help you excel)
- Bachelor's degree in Computer Science, Information Technology, or a related field
- 7+ years of professional experience in backend software development
- 5+ years of experience with Java programming and core Java concepts
- 3+ years of experience with the Spring framework (Spring Boot, Spring MVC, Spring Data)
- Familiarity with OLAP concepts and high-availability database design principles
- Experience building systems that handle large data volumes with high throughput requirements
- Proficiency in SQL and database optimization techniques
- Experience with RESTful API design and implementation
- Solid understanding of design patterns and object-oriented programming
- Experience with version control systems (Git)
- Strong problem-solving skills and attention to detail
- Excellent communication skills to collaborate effectively across teams and explain technical concepts to non-technical stakeholders

About MSCI - What We Offer You
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients.
- A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles.
- An environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose: to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer.
It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes.

Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com.

Posted 2 months ago

Apply

2.0 years

27 - 42 Lacs

Pune

Work from Office

The Role
We are currently seeking an experienced Backend Software Engineer with a strong Java background to join Addepar in our Partner Platform team! We are building out a new platform from scratch which will enable third parties to simply and safely engage with Addepar at scale. This team is passionate about handling large volumes of data and the engineering challenges in building the distributed systems responsible for automated data ingestion and transformation. We want people who are hard-working and care deeply about solving hard problems at high scale, delighting customers, and participating in the success of the whole company. We look for dedicated engineers with real technical depth and a desire to understand the end business. If you've designed sophisticated scalable systems, have extensive experience with Java and related technologies, or are just interested in tackling complicated technical, critically important problems, join us!

What You'll Do
- Work in partnership with engineering partners and other platform users to identify requirements and priorities, and map out solutions for challenging technology and workflow problems.
- Design, develop, and deploy high-quality Java applications that integrate with various data sources and services.
- Build technical skills in a high-performing team of engineers in India who can design, develop, and deploy Java-based solutions with a focus on backend services and APIs, and help other teams at Addepar build on top of the Addepar platform.
- Lay a solid foundation of the software architecture for the team in system design and code development, with a strong focus on Java and related technologies.

Who You Are
- B.S. or M.S. in Computer Science or a similar technical field of study (or equivalent practical experience).
- 4+ years of software engineering experience.
- Expert-level proficiency in backend development, with a focus on Java.
- Good experience on AWS or any other cloud platform.
- Experience with databases, SQL, NoSQL, OLAP, and/or data lake architectures.
- A strong ownership mentality and drive to solve the most important problems.
- Passion for implementing standard processes with a bias toward smart automation.
- A rapid learner with robust analytical and problem-solving abilities.
- Comfortable working in a cloud context, with automated infrastructure and service-oriented architecture.
- Experience with Java, Spring Boot, RESTful APIs, and related technologies is preferred.
- Practical knowledge of agile practices with an outlook that prioritizes experimentation and iteration, combined with an ability to guide teams toward activities and processes that facilitate optimal outcomes.

Posted 2 months ago

Apply

5.0 - 8.0 years

0 Lacs

Chennai

On-site

Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

Data Modeler - Role Purpose
The Data Modeler plays a vital role in the success of our data analytics initiatives at Wipro Technologies. This role involves designing, testing, and maintaining sophisticated software programs tailored for specific operating systems and applications deployed at client sites. The Data Modeler ensures that the final products meet rigorous quality assurance parameters and adhere to industry standards. You will work closely with cross-functional teams to create robust data architectures and drive innovation within our data-driven projects. Your expertise will contribute directly to improved business outcomes and user experiences, establishing you as a key player in our organization.

Location: Chennai (mandatory), at the customer location.

Job Description
As a Data Modeler, you are expected to possess hands-on experience in data modeling for both OLTP and OLAP systems, which will enable you to design comprehensive data solutions that cater to diverse business needs. Your in-depth knowledge of conceptual, logical, and physical data modeling will be crucial in ensuring precise data representation and storage. A strong understanding of indexing, partitioning, and data sharding, complemented by practical experience, is essential to optimize database performance and ensure efficient data retrieval. Additionally, your ability to identify and address factors impacting database performance will support near real-time reporting and enhanced application interaction. Familiarity with at least one data modeling tool, particularly DBSchema, is highly valued. Experience or functional knowledge of the mutual fund industry will provide a competitive edge, as our projects often intersect with financial services. Moreover, familiarity with GCP databases like AlloyDB, CloudSQL, and BigQuery enriches your toolkit, enabling you to apply cutting-edge cloud technology in data modeling. Please note, your willingness to work from our Chennai office is mandatory for this position, as we prioritize collaboration and on-site teamwork to drive our initiatives forward.

Mandatory Skills: Cloud-PaaS-GCP-Google Cloud Platform.
Experience: 5-8 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention: of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA; as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
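For readers weighing these requirements, here is a minimal, hypothetical sketch of the dimensional design plus partitioning the posting describes: a range-partitioned fact table in a mutual-fund-flavored star schema, applied from Python. All table, column, and connection names are invented for illustration, and PostgreSQL-style DDL is assumed, not anything specified by Wipro.

```python
# Hypothetical sketch, not Wipro's schema: a range-partitioned fact table
# in a small star schema, of the kind the JD's "indexing, partitioning,
# and data sharding" requirement implies. PostgreSQL-style DDL assumed.
FACT_NAV_DDL = """
CREATE TABLE fact_fund_nav (
    nav_date   DATE           NOT NULL,
    fund_key   INT            NOT NULL REFERENCES dim_fund (fund_key),
    scheme_key INT            NOT NULL REFERENCES dim_scheme (scheme_key),
    nav_value  NUMERIC(18, 6),
    aum        NUMERIC(18, 2)
) PARTITION BY RANGE (nav_date);
"""

# One monthly partition; a real pipeline would create these on a schedule.
PARTITION_DDL = """
CREATE TABLE fact_fund_nav_2024_01
    PARTITION OF fact_fund_nav
    FOR VALUES FROM ('2024-01-01') TO ('2024-02-01');
"""

def apply_ddl(conn) -> None:
    """Apply the DDL over an open DB-API connection (e.g. psycopg2)."""
    with conn.cursor() as cur:
        cur.execute(FACT_NAV_DDL)
        cur.execute(PARTITION_DDL)
    conn.commit()
```

Partition pruning on `nav_date` is what keeps the near real-time reporting queries the JD mentions from scanning the whole fact table.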

Posted 2 months ago

Apply

2.0 - 5.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

EXL (NASDAQ:EXLS) is a leading operations management and analytics company that helps businesses enhance growth and profitability in the face of relentless competition and continuous disruption. Using our proprietary, award-winning Business EXLerator Framework™, which integrates analytics, automation, benchmarking, BPO, consulting, industry best practices and technology platforms, we look deeper to help companies improve global operations, enhance data-driven insights, increase customer satisfaction, and manage risk and compliance. EXL serves the insurance, healthcare, banking and financial services, utilities, travel, transportation and logistics industries. Headquartered in New York, New York, EXL has more than 55,000 professionals in locations throughout the United States, Europe, Asia (primarily India and Philippines), Latin America, Australia and South Africa.

EXL Analytics provides data-driven, action-oriented solutions to business problems through statistical data mining, cutting-edge analytics techniques and a consultative approach. Leveraging proprietary methodology and best-of-breed technology, EXL Analytics takes an industry-specific approach to transform our clients' decision-making and embed analytics more deeply into their business processes. Our global footprint of nearly 8,000 data scientists and analysts assists client organizations with complex risk minimization methods, advanced marketing, pricing and CRM strategies, internal cost analysis, and cost and resource optimization within the organization. EXL Analytics serves the insurance, healthcare, banking, capital markets, utilities, retail and e-commerce, travel, transportation and logistics industries. Please visit www.exlservice.com for more information about EXL Analytics.

Role And Responsibilities Overview
- Oversee all aspects of quality assurance, including establishing and reporting on metrics, applying industry best practices and developing new tools and processes to ensure quality goals are met.
- Adhere to formal QA processes, ensuring that the Systems Implementation (SI) team is using industry-accepted best practices.
- Ensure data accuracy in SSIS, SQL and Power BI.
- Detailed and effective written communication skills, documenting the features tested and bugs found.
- Document the necessity of performance optimization; identify stale and erroneous calculations.
- Act as key point of contact for all QA aspects of releases, providing test services and coordinating QA resources internally and externally in SQL.
- Create and provide feedback on test cases, scripts, plans and procedures (manual and automated) and be responsible for executing them across the entire software development life cycle and test cycles (unit, regression, functional, systems, stress & scale, smoke and sanity).
- Strong experience in functional testing and test automation.
- Break down processes and test the different scenarios at every step.
- Diagnose defects and track them from discovery to resolution.
- Ensure the SI team maintains high quality and accuracy up to software release.
- Ensure all testing work is carried out in accordance with the testing plan, including schedule and quality requirements.
- Collaborate with the data warehouse architect, the ETL lead and BI developers in the construction and execution of test scenarios, including those applicable to the development, test and production data warehouse environments.

Candidate Profile
- Bachelor's/Master's degree in computer science/engineering, operations research or related analytics areas; candidates with BA/BS degrees in the same fields from top-tier academic institutions are also welcome to apply.
- Insurance industry knowledge preferred.
- Proficient in SQL queries, subqueries, and complex joins for generating stored procedures.
- Experience working in Power BI, ETL projects and ETL technologies like SQL Server Integration Services (SSIS).
- Good understanding of data warehouse concepts such as dimension and fact tables, SCD, OLAP, OLTP, etc.
- Experience extracting and manipulating data from relational databases with advanced SQL.
- 2-5 years of QA experience.
- Able to test data ingestion/curation pipelines.
- Experience with test artifacts (test strategy, test cases, defect logs, status reports).
- Able to write non-functional and negative test cases.
- Outstanding written and verbal communication skills.
- Able to work in a fast-paced, continuously evolving environment and ready to take up uphill challenges.
- Able to understand cross-cultural differences and work with clients across the globe.
- Self-motivated; works well independently and with others.
- Uses a metrics-driven approach and closed-loop feedback to improve software deliverables and improve predictability and reliability of releases.

What We Offer
EXL Analytics offers an exciting, fast-paced and innovative environment, which brings together a group of sharp and entrepreneurial professionals who are eager to influence business decisions. From your very first day, you get an opportunity to work closely with highly experienced, world-class analytics consultants. You can expect to learn many aspects of businesses that our clients engage in. You will also learn effective teamwork and time-management skills, key aspects for personal and professional growth. We provide guidance/coaching to every employee through our mentoring program, wherein every junior-level employee is assigned a senior-level professional as an advisor. The sky is the limit for our team members. The unique experiences gathered at EXL Analytics set the stage for further growth and development in our company and beyond.
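As a hedged illustration of the data-accuracy checks this role describes (reconciling an SSIS load between staging and the warehouse), the sketch below uses pytest-style tests with pyodbc against SQL Server; the connection strings and table names are placeholders, not EXL's actual environment.

```python
# Illustrative pytest checks for an SSIS load: reconcile row counts and a
# cheap T-SQL checksum between staging and the warehouse. Connection
# strings and table names are placeholders, not EXL's environment.
import pyodbc

SRC = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=stage;DATABASE=staging;Trusted_Connection=yes"
TGT = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=dw;DATABASE=warehouse;Trusted_Connection=yes"

def scalar(conn_str: str, sql: str):
    """Run a query and return the single value it produces."""
    with pyodbc.connect(conn_str) as conn:
        return conn.cursor().execute(sql).fetchone()[0]

def test_row_counts_match():
    assert scalar(SRC, "SELECT COUNT(*) FROM stg.policy") == \
           scalar(TGT, "SELECT COUNT(*) FROM dw.fact_policy")

def test_premium_fingerprint_matches():
    # CHECKSUM_AGG(CHECKSUM(...)) gives an order-independent fingerprint,
    # good for a fast smoke test before deeper column-level diffs.
    q = "SELECT CHECKSUM_AGG(CHECKSUM(policy_id, premium)) FROM {}"
    assert scalar(SRC, q.format("stg.policy")) == \
           scalar(TGT, q.format("dw.fact_policy"))
```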

Posted 2 months ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

It's fun to work in a company where people truly BELIEVE in what they are doing! We're committed to bringing passion and customer focus to the business.

Job Location: Bengaluru/Mumbai/Gurgaon/Pune/Chennai/Noida/Hyderabad/Coimbatore

Job Description
Fractal is a leading AI & analytics organization. We have a strong Fullstack team with great leaders accelerating the growth. Our people enjoy a collaborative work environment, exceptional training, and career development as well as unlimited growth opportunities. We have a Glassdoor rating of 4/5 and achieve a customer NPS of 9/10. If you like working with a curious, supportive, high-performing team, Fractal is the place for you.

Responsibilities
As a Fullstack (React and Python, or Java and React) Engineer, you will be part of a team consisting of a Scrum Master, Cloud Engineers, AI/ML Engineers, and UI/UX Engineers building end-to-end Data to Decision systems. You will report to a Senior Fullstack Engineer and will be responsible for:
- Managing, developing and maintaining the backend and frontend for various Data to Decision projects for our Fortune 500 client
- Working closely with the data science and engineering team to integrate algorithmic output from the backend REST APIs
- Working closely with business and product owners to create dynamic infographics with intuitive user controls
- Participating in UAT, and diagnosing and troubleshooting bugs and application integration issues
- Creating and maintaining documentation related to the developed processes and applications

Required Qualifications
4-7 years of demonstrable experience designing, building, and working as a Fullstack Engineer for enterprise web applications. Ideally, this would include the following:
- Expert-level proficiency with JavaScript (ES6), HTML5 & CSS
- Expert-level proficiency with ReactJS or VueJS
- Expert-level proficiency with Node.js
- Expert-level proficiency with Python (3.4+) and Django (2.1+) or Flask, or Java
- Familiarity with common databases (RDBMS such as MySQL and NoSQL such as MongoDB) and data warehousing concepts (OLAP, OLTP)
- Understanding of REST concepts and building/interacting with REST APIs
- Deep understanding of key UI concepts: cross-browser compatibility and implementing responsive web design
- Hands-on experience with test-driven development, using testing libraries like Jest, PyTest and Nose
- Familiarity with common JS visualization libraries built using D3, Chart.js, Highcharts, etc.
- Deep understanding of core backend concepts: developing and designing RESTful services and APIs; developing functional databases, applications, and servers to support websites on the back end; performance optimization and multithreading concepts
- Experience with deploying and maintaining high-traffic infrastructure (performance testing is a plus)
In addition, the ideal candidate would have great problem-solving skills and familiarity with code versioning tools such as GitHub.

Preferred Qualifications
- Familiarity with Microsoft Azure Cloud Services (particularly Azure Web App, Storage and VM), or familiarity with AWS (EC2 containers) or GCP services
- Experience working with UX designers and bringing designs to life
- Experience with microservices and messaging brokers (e.g., RabbitMQ)
- Experience with reverse proxy engines such as Nginx, Apache HTTPD
- Familiarity with GitHub Actions or any other CI/CD tool (e.g., Jenkins)

Education: B.E/B.Tech/M.Tech in Computer Science or related technical degree, or equivalent.

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us! Not the right fit? Let us know you're interested in a future opportunity by clicking Introduce Yourself in the top-right corner of the page, or create an account to set up email alerts as new job postings become available that meet your interest!
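Since the posting centers on Python REST backends feeding React front ends, here is a minimal sketch of that pattern with Flask; the route, port, and scoring function are hypothetical, not Fractal's actual API.

```python
# Minimal sketch of the stack's backend pattern: a Flask REST endpoint
# serving data-science output to a React front end. Route, port, and the
# scoring function are hypothetical, not Fractal's API.
from flask import Flask, jsonify, request

app = Flask(__name__)

def score_products(category: str) -> list[dict]:
    """Stand-in for the algorithmic output a real model team would supply."""
    return [{"sku": "A-1", "category": category, "score": 0.92}]

@app.route("/api/v1/scores", methods=["GET"])
def get_scores():
    category = request.args.get("category", "all")
    return jsonify({"data": score_products(category)})

if __name__ == "__main__":
    app.run(port=8000, debug=True)
```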

Posted 2 months ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Summary
Senior Full Stack Developer with Azure Cloud experience.

Required Skills
Experience: at least 5+ years in full stack software development, with proficiency in the following areas:
- Frontend: React (v18+), Redux, HTML5, CSS3, JavaScript/TypeScript, AngularJS
- Backend: Node.js (with Express), MongoDB, MySQL and RESTful API development
- Cloud: Azure, plus experience with AWS services
- DevOps practices: familiarity with modern practices, including Continuous Integration/Continuous Deployment (CI/CD) and version control using Git
- Security awareness: knowledge of security best practices, including authentication, authorization, and data encryption (JWT, OAuth, etc.)
- Testing expertise: strong skills in testing and Test-Driven Development (TDD)
- Critical thinking: excellent critical thinking skills with the ability to architect scalable and maintainable solutions
- Leadership: experience in guiding technical decisions
- Communication: outstanding communication skills, with the ability to explain complex concepts to non-technical team members
- Collaboration: a strong drive to work collaboratively with cross-functional teams, including product owners and developers

Bonus Qualifications
- Familiarity with automated testing frameworks such as Cypress, Selenium and Appium
- Experience with real-time data applications (Socket.io, WebSockets)
- Knowledge of AI tools / OpenAI API, Grok, et al.

Proficiency in the following skills:
- Front-end development: knowledge of HTML, CSS, and JavaScript
- JavaScript frameworks: proficiency in JavaScript frameworks like React.js, Angular.js
- Back-end programming languages: Java, Python, Node.js
- Databases: MySQL, Elastic, Cosmos DB, Oracle
- Version control systems: Git, JFrog
- Web server technologies: Apache, Nginx
- HTTP and REST: HTTP and RESTful design principles for APIs
- Web application architecture: keeping applications scalable and maintainable
- Testing and debugging: knowledge of testing frameworks like Mocha and debugging processes
- Security: awareness of security concerns and how to prevent common security threats
- Data structures and algorithms
- Deployment and DevOps: Docker, Jenkins
- Azure: deep understanding of a cloud service platform such as Azure
- Infrastructure as Code (IaC): Terraform
- Networking: knowledge of networking topics, particularly in a cloud context, including DNS, TCP/IP, HTTP/S, VPNs, and firewalls
- Proficiency in Power BI: deep understanding of Power BI Desktop, Power BI Service, and the ability to design compelling dashboards and reports using these tools
- Data modeling: ability to create and understand data models, including the relationships between different data sets
- DAX and M queries: proficiency in Data Analysis Expressions (DAX) and the M language to manipulate data and create complex calculations
- SQL knowledge: strong SQL skills are often required to write queries, create views and stored procedures, and manipulate databases
- Data visualization: strong skills in data visualization and the ability to choose the right visuals based on the data and the business requirements
- Data analysis: ability to analyze data and draw out insights that can be represented through reports and dashboards
- Understanding of ETL processes: knowledge of Extract, Transform, Load (ETL) processes to prepare data for use in Power BI
- Data warehousing concepts: understanding of data warehousing concepts such as star schema, snowflake schema, and OLAP cubes
- Soft skills: good communication skills, problem-solving ability, and a lifelong learning attitude

Posted 2 months ago

Apply

7.0 years

0 Lacs

Himachal Pradesh, India

Remote

As a global leader in cybersecurity, CrowdStrike protects the people, processes and technologies that drive modern organizations. Since 2011, our mission hasn't changed: we're here to stop breaches, and we've redefined modern security with the world's most advanced AI-native platform. We work on large-scale distributed systems, processing almost 3 trillion events per day, with 3.44 PB of RAM deployed across our fleet of C* servers, and this traffic is growing daily. Our customers span all industries, and they count on CrowdStrike to keep their businesses running, their communities safe and their lives moving forward. We're also a mission-driven company. We cultivate a culture that gives every CrowdStriker both the flexibility and autonomy to own their careers. We're always looking to add talented CrowdStrikers to the team who have limitless passion, a relentless focus on innovation and a fanatical commitment to our customers, our community and each other. Ready to join a mission that matters? The future of cybersecurity starts with you.

About The Role
The charter of the Data + ML Platform team is to harness all the data that is ingested and cataloged within the Data LakeHouse for exploration, insights, model development, ML engineering and insights activation. This team is situated within the larger Data Platform group, which serves as one of the core pillars of our company. We process data at a truly immense scale. Our processing is composed of various facets including threat events collected via telemetry data, associated metadata, IT asset information, contextual information about threat exposure based on additional processing, and more. These facets comprise the overall data platform, which is currently over 200 PB and maintained in a hyperscale Data Lakehouse built and owned by the Data Platform team. The ingestion mechanisms include both batch and near real-time streams that form the core Threat Analytics Platform used for insights, threat hunting, incident investigations and more. As an engineer on this team, you will play an integral role as we build out our ML Experimentation Platform from the ground up. You will collaborate closely with Data Platform Software Engineers, Data Scientists and Threat Analysts to design, implement, and maintain scalable ML pipelines for data preparation, cataloging, feature engineering, model training, and model serving that influence critical business decisions. You'll be a key contributor in a production-focused culture that bridges the gap between model development and operational success. Future plans include generative AI investments for use cases such as modeling attack paths for IT assets.

What You'll Do
- Help design, build, and facilitate adoption of a modern Data+ML platform
- Modularize complex ML code into standardized and repeatable components
- Establish and facilitate adoption of repeatable patterns for model development, deployment, and monitoring
- Build a platform that scales to thousands of users and offers self-service capability to build ML experimentation pipelines
- Leverage workflow orchestration tools to deploy efficient and scalable execution of complex data and ML pipelines
- Review code changes from data scientists and champion software development best practices
- Leverage cloud services like Kubernetes, blob storage, and queues in our cloud-first environment

What You'll Need
- B.S. in Computer Science, Data Science, Statistics, Applied Mathematics, or a related field and 7+ years of related experience; or M.S. with 5+ years of experience; or Ph.D. with 6+ years of experience
- 3+ years of experience developing and deploying machine learning solutions to production
- Familiarity with typical machine learning algorithms from an engineering perspective (how they are built and used, not necessarily the theory); familiarity with supervised/unsupervised approaches: how, why, and when labeled data is created and used
- 3+ years of experience with ML platform tools like Jupyter Notebooks, NVIDIA Workbench, MLflow, Ray, Vertex AI, etc.
- Experience building data platform products or features with (one of) Apache Spark, Flink or comparable tools in GCP; experience with Iceberg is highly desirable
- Proficiency in distributed computing and orchestration technologies (Kubernetes, Airflow, etc.)
- Production experience with infrastructure-as-code tools such as Terraform, FluxCD
- Expert-level experience with Python; Java/Scala exposure is recommended
- Ability to write Python interfaces to provide standardized and simplified interfaces for data scientists to utilize internal CrowdStrike tools
- Expert-level experience with CI/CD frameworks such as GitHub Actions
- Expert-level experience with containerization frameworks
- Strong analytical and problem-solving skills, capable of working in a dynamic environment
- Exceptional interpersonal and communication skills; ability to work with stakeholders across multiple teams and synthesize their needs into software interfaces and processes

Experience With The Following Is Desirable
- Go
- Iceberg
- Pinot or another time-series/OLAP-style database
- Jenkins
- Parquet
- Protocol Buffers/gRPC

Benefits Of Working At CrowdStrike
- Remote-friendly and flexible work culture
- Market leader in compensation and equity awards
- Comprehensive physical and mental wellness programs
- Competitive vacation and holidays for recharge
- Paid parental and adoption leaves
- Professional development opportunities for all employees regardless of level or role
- Employee Resource Groups, geographic neighbourhood groups and volunteer opportunities to build connections
- Vibrant office culture with world-class amenities
- Great Place to Work Certified™ across the globe

CrowdStrike is proud to be an equal opportunity employer. We are committed to fostering a culture of belonging where everyone is valued for who they are and empowered to succeed. We support veterans and individuals with disabilities through our affirmative action program.

CrowdStrike is committed to providing equal employment opportunity for all employees and applicants for employment. The Company does not discriminate in employment opportunities or practices on the basis of race, color, creed, ethnicity, religion, sex (including pregnancy or pregnancy-related medical conditions), sexual orientation, gender identity, marital or family status, veteran status, age, national origin, ancestry, physical disability (including HIV and AIDS), mental disability, medical condition, genetic information, membership or activity in a local human rights commission, status with regard to public assistance, or any other characteristic protected by law. We base all employment decisions, including recruitment, selection, training, compensation, benefits, discipline, promotions, transfers, lay-offs, return from lay-off, terminations and social/recreational programs, on valid job requirements.

If you need assistance accessing or reviewing the information on this website or need help submitting an application for employment or requesting an accommodation, please contact us at recruiting@crowdstrike.com for further assistance.
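One requirement above, writing Python interfaces that give data scientists a standardized way to use internal tooling, is concrete enough to sketch. The wrapper below uses the public MLflow API (a tool the posting lists) to enforce a consistent experiment naming convention and run tags; the team and tag names are invented for illustration, not CrowdStrike's conventions.

```python
# Hedged sketch of a standardized Python interface for data scientists:
# a thin wrapper over the public MLflow API that enforces an experiment
# naming convention and consistent run tags. The "team"/"platform" tag
# names are invented for illustration.
from contextlib import contextmanager

import mlflow

@contextmanager
def tracked_run(team: str, model_name: str, **params):
    """Open an MLflow run with the team's required naming and tags."""
    mlflow.set_experiment(f"/{team}/{model_name}")
    with mlflow.start_run(run_name=model_name) as run:
        mlflow.set_tags({"team": team, "platform": "ml-experimentation"})
        mlflow.log_params(params)
        yield run

# Usage: consistent tracking without per-notebook MLflow boilerplate.
# with tracked_run("threat-analytics", "attack-path-v1", lr=0.01):
#     mlflow.log_metric("auc", 0.97)
```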

Posted 2 months ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Who We Are
Addepar is a global technology and data company that helps investment professionals provide the most informed, precise guidance for their clients. Hundreds of thousands of users have entrusted Addepar to empower smarter investment decisions and better advice over the last decade. With client presence in more than 50 countries, Addepar's platform aggregates portfolio, market and client data for over $7 trillion in assets. Addepar's open platform integrates with more than 100 software, data and services partners to deliver a complete solution for a wide range of firms and use cases. Addepar embraces a global flexible workforce model with offices in Silicon Valley, New York City, Salt Lake City, Chicago, London, Edinburgh, Pune, and Dubai.

The Role
We are currently seeking an experienced Backend Software Engineer with a strong Java background to join Addepar in our Partner Platform team! We are building out a new platform from scratch which will enable third parties to simply and safely engage with Addepar at scale. This team is passionate about handling large volumes of data and the engineering challenges in building the distributed systems responsible for automated data ingestion and transformation. We want people who are hard-working and care deeply about solving hard problems at high scale, delighting customers, and participating in the success of the whole company. We look for dedicated engineers with real technical depth and a desire to understand the end business. If you've designed sophisticated scalable systems, have extensive experience with Java and related technologies, or are just interested in tackling complicated technical, critically important problems, join us!

What You'll Do
- Work in partnership with engineering partners and other platform users to identify requirements and priorities, and map out solutions for challenging technology and workflow problems.
- Design, develop, and deploy high-quality Java applications that integrate with various data sources and services.
- Build technical skills in a high-performing team of engineers in India who can design, develop, and deploy Java-based solutions with a focus on backend services and APIs, and help other teams at Addepar build on top of the Addepar platform.
- Lay a solid foundation of the software architecture for the team in system design and code development, with a strong focus on Java and related technologies.

Who You Are
- B.S. or M.S. in Computer Science or a similar technical field of study (or equivalent practical experience).
- 4+ years of software engineering experience.
- Expert-level proficiency in backend development, with a focus on Java.
- Good experience on AWS or any other cloud platform.
- Experience with databases, SQL, NoSQL, OLAP, and/or data lake architectures.
- A strong ownership mentality and drive to solve the most important problems.
- Passion for implementing standard processes with a bias toward smart automation.
- A rapid learner with robust analytical and problem-solving abilities.
- Comfortable working in a cloud context, with automated infrastructure and service-oriented architecture.
- Experience with Java, Spring Boot, RESTful APIs, and related technologies is preferred.
- Practical knowledge of agile practices with an outlook that prioritizes experimentation and iteration, combined with an ability to guide teams toward activities and processes that facilitate optimal outcomes.

Our Values
- Act Like an Owner: Think and operate with intention, purpose and care. Own outcomes.
- Build Together: Collaborate to unlock the best solutions. Deliver lasting value.
- Champion Our Clients: Exceed client expectations. Our clients' success is our success.
- Drive Innovation: Be bold and unconstrained in problem solving. Transform the industry.
- Embrace Learning: Engage our community to broaden our perspective. Bring a growth mindset.

In addition to our core values, Addepar is proud to be an equal opportunity employer. We seek to bring together diverse ideas, experiences, skill sets, perspectives, backgrounds and identities to drive innovative solutions. We commit to promoting a welcoming environment where inclusion and belonging are held as a shared responsibility. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

PHISHING SCAM WARNING: Addepar is among several companies recently made aware of a phishing scam involving con artists posing as hiring managers recruiting via email, text and social media. The imposters are creating misleading email accounts, conducting remote "interviews," and making fake job offers in order to collect personal and financial information from unsuspecting individuals. Please be aware that no job offers will be made from Addepar without a formal interview process. Additionally, Addepar will not ask you to purchase equipment or supplies as part of your onboarding process. If you have any questions, please reach out to TAinfo@addepar.com.

Posted 2 months ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Bengaluru

Work from Office

Job Description & Summary
We are looking for a seasoned Azure Data Engineer with strong proficiency in PySpark and SQL, an understanding of the nature of OLAP and OLTP architecture, an understanding of the medallion architecture, and strong proficiency with Databricks notebooks, Databricks Jobs, DLT streaming tables, DLT materialized views, and DLT pipelines.

Mandatory skill sets
- Proficiency with Databricks Unity Catalog
- Strong proficiency with Azure Data Lake
- Knowledge of Azure Data Factory
- Strong proficiency with Amazon S3
- Strong proficiency with AWS Glue
- Knowledge of Amazon Kinesis

Preferred skill sets
- Sound knowledge of PowerShell and CI/CD pipelines
- Good knowledge of Python backend development

Years of experience required: 4+
Education qualification: BE/B.Tech/MBA
Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration

Required Skills: Angular
Additional skills: Accepting Feedback, Active Listening, Analytical Reasoning, Analytical Thinking, Application Software, Business Data Analytics, Business Management, Business Technology, Business Transformation, Communication, Creativity, Documentation Development, Embracing Change, Emotional Regulation, Empathy, Implementation Research, Implementation Support, Implementing Technology, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Performance Assessment, Performance Management Software {+ 16 more}
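For candidates unfamiliar with the medallion terminology in this posting, a minimal bronze-to-silver hop in PySpark looks roughly like the sketch below. Paths and columns are placeholders, and the Delta output assumes the Delta Lake package is available; this is an illustration, not PwC's pipeline.

```python
# Sketch of a bronze-to-silver hop in a medallion layout. Paths, columns,
# and the Delta output (which needs the Delta Lake package) are
# assumptions for illustration, not a real pipeline.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

# Bronze: raw ingested events, schema-on-read.
bronze = spark.read.json("/lake/bronze/orders/")

# Silver: deduplicated, typed, and filtered for basic quality.
silver = (
    bronze
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("amount") > 0)
)

silver.write.mode("overwrite").format("delta").save("/lake/silver/orders/")
```

In Databricks, the same transform would typically be declared as a DLT streaming table or materialized view rather than an imperative job, which is what the DLT skill requirements refer to.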

Posted 2 months ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Where: Hyderabad/Bengaluru, India (hybrid mode, 3 days/week in office)

Job Description
- Collaborate with stakeholders to develop a data strategy that meets enterprise needs and industry requirements.
- Create an inventory of the data necessary to build and implement a data architecture.
- Envision data pipelines and how data will flow through the data landscape.
- Evaluate current data management technologies and what additional tools are needed.
- Determine upgrades and improvements to current data architectures.
- Design, document, build and implement database architectures and applications; hands-on experience in building high-scale OLAP systems is required.
- Build data models for database structures, analytics, and use cases.
- Develop and enforce database development standards, with solid DB/query optimization capabilities.
- Integrate new systems and functions like security, performance, scalability, governance, reliability, and data recovery.
- Research new opportunities and create methods to acquire data.
- Develop measures that ensure data accuracy, integrity, and accessibility.
- Continually monitor, refine, and report on data management system performance.

Required Qualifications And Skillset
- Extensive knowledge of the Azure and GCP clouds and the DataOps data ecosystem (super strong in one of the two clouds and satisfactory in the other)
- Hands-on expertise in systems like Snowflake, Synapse, SQL DW, BigQuery, and Cosmos DB (expertise in any 3 is a must)
- Azure Data Factory, Dataiku, Fivetran, Google Cloud Dataflow (any 2)
- Hands-on experience working with services/technologies like Apache Airflow, Cloud Composer, Oozie, Azure Data Factory, and Cloud Data Fusion (expertise in any 2 is required)
- Well-versed in data services, integration, ingestion, ELT/ETL, data governance, security, and meta-driven development
- Expertise in RDBMS (relational database management systems): writing complex SQL logic, DB/query optimization, data modeling, and managing high data volumes for mission-critical applications
- Strong grip on programming using Python and PySpark
- Clear understanding of data best practices prevailing in the industry
- Preference for candidates holding an Azure or GCP architect certification (either of the two would suffice)
- Strong networking and data security experience

Awareness Of The Following
- Application development understanding (full stack)
- Experience with open-source tools like Kafka, Spark, Splunk, Superset, etc.
- Good understanding of the analytics platform landscape, including AI/ML
- Experience with any data visualization tool like Power BI / Tableau / Qlik / QuickSight etc.

About Us
Gramener is a design-led data science company. We build custom Data & AI solutions that help solve complex business problems with actionable insights and compelling data stories. We partner with enterprise data and digital transformation teams to improve the data-driven decision-making culture across the organization. Our open standard low-code platform, Gramex, rapidly builds engaging Data & AI solutions across multiple business verticals and use cases. Our solutions and technology have been recognized by analysts such as Gartner and Forrester and have won several awards.

We Offer You
- a chance to try new things and take risks
- meaningful problems you'll be proud to solve
- people you will be comfortable working with
- a transparent and innovative work environment

To know more about us, visit the Gramener Website and Gramener Blog.

If you are interested, kindly share the details below:
Total Experience:
Relevant Experience:
Current CTC:
Notice Period:
Expected CTC:
Current Location:
Skills: OLAP, Microsoft Azure, Architecting

Posted 2 months ago

Apply

2.0 - 3.0 years

5 - 9 Lacs

Vadodara

Work from Office

Job Description
We are looking for a highly capable Senior Full Stack Engineer to be a core contributor in developing our suite of product offerings. If you love working on complex problems and writing clean code, you will love this role. Our goal is to solve a messy problem elegantly and cost-effectively. Our job is to collect, categorize, and analyze semi-structured data from different sources (20 million+ products from 500+ websites into our catalog of 500 million+ products). We help our customers discover new patterns in their data that can be leveraged so that they can become more competitive and increase their revenue.

Essential Functions
- Think like our customers: you will work with product and engineering leaders to define intuitive solutions
- Design customer-facing UI and back-end services for various business processes
- Develop high-performance applications by writing testable, reusable, and efficient code
- Implement effective security protocols, data protection measures, and storage solutions
- Improve the quality of our solutions: you will hold yourself and your team members accountable for writing high-quality, well-designed, maintainable software
- Own your work: you will take responsibility to shepherd your projects from idea through delivery into production
- Bring new ideas to the table: some of our best innovations originate within the team
- Guide and mentor others on the team

Technologies We Use
- Languages: NodeJS/NestJS/TypeScript, SQL, React/Redux, GraphQL
- Infrastructure: AWS, Docker, Kubernetes, Terraform, GitHub Actions, ArgoCD
- Databases: Postgres, MongoDB, Redis, Elasticsearch, Trino, Iceberg
- Streaming and queuing: Kafka, NATS, Keda

Qualifications
- 6+ years of professional software engineering/development experience
- Proficiency with architecting and delivering solutions within a distributed software platform
- Full stack engineering

Posted 2 months ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Welcome to Warner Bros. Discovery… the stuff dreams are made of.

Who We Are…
When we say, "the stuff dreams are made of," we're not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD's vast portfolio of iconic content and beloved brands are the storytellers bringing our characters to life, the creators bringing them to your living rooms and the dreamers creating what's next… From brilliant creatives to technology trailblazers across the globe, WBD offers career-defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best self. Here you are supported, here you are celebrated, here you can thrive.

Your New Role
The Business Intelligence Analyst will support the ongoing design and development of dashboards, reports, and other analytics studies or needs. To be successful in the role you'll need to be intellectually curious, detail-oriented, open to new ideas, and possess data skills and a strong aptitude for quantitative methods. The role requires strong SQL skills and wide experience using BI visualization tools like Tableau and Power BI.

Your Role Accountabilities
- With the support of other analysis and technical teams, collect and analyze stakeholders' requirements.
- Develop interactive and user-friendly dashboards and reports, partnering with UI/UX designers.
- Be experienced in BI tools like Power BI, Tableau, Looker, MicroStrategy and BusinessObjects, and be capable of and eager to learn new and other tools.
- Be able to quickly shape data into reporting and analytics solutions.
- Work with the data and visualization platform team on reporting tool updates, understanding how new features can benefit our stakeholders in the future, and adapting existing dashboards and reports.
- Have knowledge of database fundamentals such as multidimensional database design, relational database design, and more.

Qualifications & Experiences
- 2+ years of experience working with BI tools or in any data-specific role, with sound knowledge of database management, data modeling, business intelligence, SQL querying, data warehousing, and online analytical processing (OLAP)
- Skills in BI tools and BI systems such as Power BI, SAP BO, Tableau, Looker, MicroStrategy, etc.: creating data-rich dashboards, implementing row-level security (RLS) in Power BI, writing DAX expressions, and developing custom BI products with scripting and programming languages such as R, Python, etc.
- In-depth understanding of and experience with BI stacks
- The ability to drill down on data and visualize it in the best possible way through charts, reports, or dashboards
- Self-motivated and eager to learn
- Ability to communicate with business as well as technical teams
- Strong client management skills
- Ability to learn and quickly respond to a rapidly changing business environment
- An analytical and problem-solving mindset and approach

Not Required But Preferred Experience
- BA/BS or MA/MS in a design-related field, or equivalent experience (relevant degree subjects include computer science, digital design, graphic design, web design, web technology)
- Understanding of software development architecture and technical aspects

How We Get Things Done…
This last bit is probably the most important! Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done. You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day to day. We hope they resonate with you and look forward to discussing them during your interview.

Championing Inclusion at WBD
Warner Bros. Discovery embraces the opportunity to build a workforce that reflects a wide array of perspectives, backgrounds and experiences. Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability or any other category protected by law. If you're a qualified candidate with a disability and you require adjustments or accommodations during the job application and/or recruitment process, please visit our accessibility page for instructions to submit your request.
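As a rough illustration of the OLAP drill-down idea this posting references, a pandas pivot table can stand in for a cube: roll revenue up by one dimension, then drill down to a finer grain. The dataset is invented; in Power BI the equivalent would be DAX measures over dimension tables.

```python
# OLAP-style rollup and drill-down on an invented dataset; a pandas
# pivot table stands in for a cube engine here.
import pandas as pd

sales = pd.DataFrame({
    "region":  ["NA", "NA", "EU", "EU", "EU"],
    "product": ["A", "B", "A", "A", "B"],
    "quarter": ["Q1", "Q1", "Q1", "Q2", "Q2"],
    "revenue": [100, 150, 80, 120, 90],
})

# Rollup: total revenue by one dimension.
by_region = sales.pivot_table(values="revenue", index="region", aggfunc="sum")

# Drill-down: region -> product, spread across quarters.
detail = sales.pivot_table(values="revenue", index=["region", "product"],
                           columns="quarter", aggfunc="sum", fill_value=0)
print(by_region, detail, sep="\n\n")
```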

Posted 2 months ago

Apply

5.0 years

0 Lacs

Andhra Pradesh, India

On-site

At PwC, our people in infrastructure focus on designing and implementing robust, secure IT systems that support business operations. They enable the smooth functioning of networks, servers, and data centres to optimise performance and minimise downtime. In infrastructure engineering at PwC, you will focus on designing and implementing robust and scalable technology infrastructure solutions for clients. Your work will involve network architecture, server management, and cloud computing experience.

Data Modeler - Job Description
We are looking for candidates with a strong background in data modeling, metadata management, and data system optimization. You will be responsible for analyzing business needs, developing long-term data models, and ensuring the efficiency and consistency of our data systems.

Key areas of expertise include:
- Analyze and translate business needs into long-term solution data models.
- Evaluate existing data systems and recommend improvements.
- Define rules to translate and transform data across data models.
- Work with the development team to create conceptual data models and data flows.
- Develop best practices for data coding to ensure consistency within the system.
- Review modifications of existing systems for cross-compatibility.
- Implement data strategies and develop physical data models.
- Update and optimize local and metadata models.
- Utilize canonical data modeling techniques to enhance data system efficiency.
- Evaluate implemented data systems for variances, discrepancies, and efficiency.
- Troubleshoot and optimize data systems to ensure optimal performance.
- Strong expertise in relational and dimensional modeling (OLTP, OLAP).
- Experience with data modeling tools (Erwin, ER/Studio, Visio, PowerDesigner).
- Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL).
- Knowledge of NoSQL databases (MongoDB, Cassandra) and their data structures.
- Experience working with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI).
- Familiarity with ETL processes, data integration, and data governance frameworks.
- Strong analytical, problem-solving, and communication skills.

Qualifications
- Bachelor's degree in Engineering or a related field.
- 5 to 9 years of experience in data modeling or a related field.
- 4+ years of hands-on experience with dimensional and relational data modeling.
- Expert knowledge of metadata management and related tools.
- Proficiency with data modeling tools such as Erwin, PowerDesigner, or Lucid.
- Knowledge of transactional databases and data warehouses.

Preferred Skills
- Experience in cloud-based data solutions (AWS, Azure, GCP).
- Knowledge of big data technologies (Hadoop, Spark, Kafka).
- Understanding of graph databases and real-time data processing.
- Certifications in data management, modeling, or cloud data engineering.
- Excellent communication and presentation skills.
- Strong interpersonal skills to collaborate effectively with various teams.
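One hedged example of the "physical data model" deliverable this role mentions: a single dimension/fact pair expressed as SQLAlchemy ORM classes, which doubles as executable documentation of the schema. Table and column names are invented; the role's actual tooling (Erwin, ER/Studio, etc.) is listed above.

```python
# Invented example of a physical model as code: one dimension/fact pair
# as SQLAlchemy ORM classes. Illustrative only.
from sqlalchemy import Column, Date, ForeignKey, Integer, Numeric, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class DimCustomer(Base):
    __tablename__ = "dim_customer"
    customer_key = Column(Integer, primary_key=True)   # surrogate key
    customer_name = Column(String(100), nullable=False)
    segment = Column(String(30))

class FactSales(Base):
    __tablename__ = "fact_sales"
    sales_key = Column(Integer, primary_key=True)
    customer_key = Column(Integer, ForeignKey("dim_customer.customer_key"),
                          nullable=False)
    sale_date = Column(Date, nullable=False)
    amount = Column(Numeric(18, 2))                    # additive measure
```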

Posted 2 months ago

Apply

4.0 - 14.0 years

22 - 25 Lacs

Pune

Work from Office

Join us as a Snr Developer at Barclays, where you'll take part in the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. As part of the team, you will deliver the technology stack, using strong analytical and problem-solving skills to understand business requirements and deliver quality solutions. You'll be working on complex technical problems that require detailed analysis, in conjunction with fellow engineers, business analysts and business stakeholders.

To be successful as a Snr Developer you should have experience with:
- Minimum qualification: B.E. / B.Tech or equivalent.
- 10+ years of experience in database development in the banking domain.
- Expertise in Oracle and SQL Server; proficiency in SQL.
- Experience with data migration tools.
- Optimising OLTP/OLAP and operational DB systems for Valpre and CLM.
- Leading database design for all customer lifecycle management services.
- Implementing migration strategies.
- Implementing data masking/encryption for PII under PCI DSS and GDPR (a hedged sketch follows this posting).
- Mentoring junior developers in DB best practices.
- Establishing CI/CD pipelines with audit trails.
- Collaborating with Data Architects, Data Analysts and Data Modelers.

Some other highly valued skills include:
- Effective and efficient stakeholder management.
- Good communication skills.
- Good working knowledge and hands-on experience of workflow applications and business rules engines.
- Good knowledge of the banking domain.

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills. This role is based in Pune.

Purpose of the role
To design, develop and improve software, utilising various engineering methodologies, that provides business, platform, and technology capabilities for our customers and colleagues.

Accountabilities
- Development and delivery of high-quality software solutions using industry-aligned programming languages, frameworks, and tools, ensuring that code is scalable, maintainable, and optimised for performance.
- Cross-functional collaboration with product managers, designers, and other engineers to define software requirements, devise solution strategies, and ensure seamless integration and alignment with business objectives.
- Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing.
- Staying informed of industry technology trends and innovations, and actively contributing to the organisation's technology communities to foster a culture of technical excellence and growth.
- Adherence to secure coding practices to mitigate vulnerabilities, protect sensitive data, and ensure secure software solutions.
- Implementation of effective unit testing practices to ensure proper code design, readability, and reliability.

Assistant Vice President Expectations
To advise and influence decision making, contribute to policy development and take responsibility for operational effectiveness. Collaborate closely with other functions/business divisions. Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives and determine reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. An individual contributor will instead lead collaborative assignments and guide team members through structured assignments, identifying the need to include other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes. Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Collaborate with other areas of work and business-aligned support areas to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple internal and external sources (such as procedures and practices in other areas, teams, and companies) to solve problems creatively and effectively. Communicate complex information, which may include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes.
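One way to approach the PII masking requirement above is column-level masking, sketched here against SQL Server's Dynamic Data Masking feature. The table, columns, and connection string are hypothetical; the ALTER TABLE ... ADD MASKED syntax follows SQL Server's documented form, and a real PCI DSS/GDPR implementation would pair masking with encryption and access controls:

```python
# Hedged sketch: apply Dynamic Data Masking rules to assumed PII columns.
import pyodbc

MASKING_DDL = """
ALTER TABLE dbo.Customers
    ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');
ALTER TABLE dbo.Customers
    ALTER COLUMN CardNumber ADD MASKED WITH (FUNCTION = 'partial(0,"XXXX-XXXX-XXXX-",4)')
"""

def apply_masking(conn_str: str) -> None:
    """Apply masking rules; non-privileged readers then see masked values."""
    with pyodbc.connect(conn_str, autocommit=True) as conn:
        for statement in filter(None, (s.strip() for s in MASKING_DDL.split(";"))):
            conn.execute(statement)
```

Users without the UNMASK permission then see 'aXXX@XXXX.com'-style values in query results, while the stored data is unchanged.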

Posted 2 months ago

Apply

4.0 years

6 - 9 Lacs

Hyderābād

On-site

About Citco
Citco is a global leader in fund services, corporate governance and related asset services with staff across 80 offices worldwide. With more than $1.7 trillion in assets under administration, we deliver end-to-end solutions and exceptional service to meet our clients' needs. For more information about Citco, please visit www.citco.com

About the Team & Business Line
Citco Fund Services is a division of the Citco Group of Companies and is the largest independent administrator of hedge funds in the world. Our continuous investment in learning and technology solutions means our people are equipped to deliver a seamless client experience. This position reports into the Loan Services business line. As a core member of our Loan Services Data and Reporting team, you will work with some of the industry's most accomplished professionals to deliver award-winning services for complex fund structures that our clients can depend upon.

Your Role
- Develop and execute database queries and conduct data analyses.
- Create scripts to analyze and modify data, import/export scripts, and execute stored procedures.
- Model data by writing SQL queries/Python code to support data integration and dashboard requirements.
- Develop data pipelines that provide fast, optimized, and robust end-to-end solutions.
- Leverage and contribute to designing/building relational database schemas for analytics.
- Handle and manipulate data in various structures and repositories (data cube, data mart, data warehouse, data lake).
- Analyze, implement and contribute to building APIs to improve the data integration pipeline.
- Perform data preparation tasks including data cleaning, normalization, deduplication, and type conversion (a minimal sketch follows this posting).
- Perform data integration through extracting, transforming and loading (ETL) data from various sources.
- Identify opportunities to improve processes and strategies with technology solutions, and identify development needs to improve and streamline operations.
- Create tabular, matrix, parameterized and visual reports/dashboards in a reporting application such as Power BI Desktop/Cloud or Qlik. Integrating Power BI/Qlik reports into other applications using embedded analytics such as Power BI service (SaaS), or via API automation, is also an advantage.
- Implement NLP techniques for text representation, semantic extraction, data structures and modelling.
- Contribute to deployment and maintenance of machine learning solutions in production environments.
- Build and design cloud applications using Microsoft Azure/AWS cloud technologies.

About You: Background / Qualifications
- Bachelor's degree in technology/related field or equivalent work experience.
- 4+ years of SQL and/or Python experience is a must.
- Strong knowledge of data concepts and tools; experienced in RDBMS such as MS SQL Server, Oracle etc.
- Well versed in the concepts and techniques of Business Intelligence and Data Warehousing.
- Strong database design and SQL skills: objects development, performance tuning and data analysis.
- In-depth understanding of database management systems, OLAP and ETL frameworks.
- Familiarity or hands-on experience working with REST or SOAP APIs.
- Well versed in API management and integration with various data sources in cloud platforms, to help connect to traditional SQL and new-age data sources such as Snowflake.
- Familiarity with machine learning concepts such as feature selection, deep learning, AI, and ML/DL frameworks (TensorFlow, PyTorch) and libraries (scikit-learn, StatsModels) is an advantage.
- Familiarity with BI technologies (e.g. Microsoft Power BI, Oracle BI) is an advantage.
- Hands-on experience with at least one ETL tool (SSIS, Informatica, Talend, Glue, Azure Data Factory) and associated data integration principles is an advantage.
- Minimum 1+ year of experience with cloud platform technologies (AWS/Azure), including Azure Machine Learning, is desirable.

The following AWS experience is a plus:
- Implementing identity and access management (IAM) policies and managing user accounts with IAM.
- Writing infrastructure as code (IaC) using CloudFormation or Terraform.
- Implementing cloud storage using Amazon Simple Storage Service (S3).
- Serverless approaches using AWS Lambda, e.g. AWS SAM.
- Configuring Amazon Elastic Compute Cloud (EC2) instances.

Previous Work Experience
- Experience querying databases and strong programming skills: Python, SQL, PySpark etc.
- Prior experience supporting ETL production environments and web technologies such as XML is an advantage.
- Previous working experience with Azure Data Services including ADF, ADLS, Blob, Databricks, Hive, Python, Spark and/or features of Azure ML Studio, ML Services and MLOps is an advantage.
- Experience with dashboard and reporting applications like Qlik, Tableau, Power BI.

Other
- Well-rounded individual possessing a high degree of initiative.
- Proactive person willing to accept responsibility with very little hand-holding.
- A strong analytical and logical mindset.
- Demonstrated proficiency in interpersonal and communication skills, including oral and written English.
- Ability to work in fast-paced, complex business and IT environments.
- Knowledge of loan servicing and/or loan administration is an advantage.
- Understanding of Agile/Scrum methodology as it relates to the software development lifecycle.

What We Offer
A rewarding and challenging environment that spans multiple geographies and business lines; a great working environment, competitive salary and benefits, and opportunities for educational support; the chance to be part of an industry-leading global organisation renowned for excellence; and opportunities for personal and professional career development.

Our Benefits
Your well-being is of paramount importance to us, and central to our success. We provide a range of benefits, training and education support, and flexible working arrangements to help you achieve success in your career while balancing personal needs. Ask us about specific benefits in your location. We embrace diversity, prioritizing the hiring of people from diverse backgrounds. Our inclusive culture is a source of pride and strength, fostering innovation and mutual respect. Citco welcomes and encourages applications from people with disabilities. Accommodations are available upon request for candidates taking part in all aspects of the selection process.
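A minimal sketch of the data-preparation tasks this role lists (cleaning, deduplication, type conversion), using pandas; the column names and rules are illustrative only:

```python
# Data-preparation sketch: dedupe on a key, coerce types, drop unusable rows.
import pandas as pd

raw = pd.DataFrame({
    "trade_id":   ["T1", "T1", "T2", "T3"],
    "notional":   ["1,000.50", "1,000.50", "250", None],
    "trade_date": ["2024-01-15", "2024-01-15", "2024-01-16", "2024-02-01"],
})

clean = (
    raw.drop_duplicates(subset="trade_id")                       # deduplication
       .assign(
           notional=lambda df: pd.to_numeric(                    # type conversion
               df["notional"].str.replace(",", ""), errors="coerce"),
           trade_date=lambda df: pd.to_datetime(df["trade_date"]),  # normalization
       )
       .dropna(subset=["notional"])                              # basic cleaning
)
print(clean.dtypes)
```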

Posted 2 months ago

Apply

10.0 years

0 Lacs

Bengaluru East, Karnataka, India

On-site

- 3-10 years of Python data engineering experience.
- Experience developing data pipelines that process large volumes of data using Python, PySpark, Pandas etc. on Azure (see the sketch after this posting).
- Experience developing ETL, OLAP-based and analytical applications.
- Experience ingesting batch and streaming data from various data sources.
- Strong experience writing complex SQL using any RDBMS (Oracle, PostgreSQL, SQL Server etc.).
- Ability to quickly learn and develop expertise in existing, highly complex applications and architectures.
- Experience with Azure Data Factory (ADF), Azure Databricks, Azure Synapse, Azure SQL, Azure DWH, Azure Data Lake Storage/Analytics, and Azure Database Migration Services.
- Experience with DevOps and CI/CD tools, including CI/CD pipelines, branching strategies, and Git for code management.
- Familiarity with REST APIs.
- Clear and precise communication skills.
- Bachelor's degree in computer science, information technology, or a similar field.

You will need to be well spoken and able to establish productive, long-lasting working relationships with a large variety of stakeholders. You will take the lead on data pipeline design, with strong analytical skills and a keen eye for detail to really understand and tackle the challenges businesses are facing. You will be confronted with a large variety of data engineering tools and other new technologies, as well as a wide variety of IT, compliance and security related issues. You will design and develop world-class technology solutions to solve business problems across multiple client engagements, collaborate with other teams to understand business requirements, client infrastructure, platforms and overall strategy to ensure seamless transitions, and work closely with the AI and A team to build world-class solutions and to define AI strategy. You will possess strong logical structuring and problem-solving skills, an expert-level understanding of databases, an inherent desire to turn data into actions, and strong verbal, written and presentation skills.
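A hedged sketch of the kind of batch pipeline described above: read raw data from Azure Data Lake Storage, transform with PySpark, and write a partitioned curated layer. The storage account, container paths, and columns are hypothetical:

```python
# PySpark batch-pipeline sketch: raw ADLS layer -> curated, partitioned layer.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-batch-pipeline").getOrCreate()

raw = spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/orders/")

curated = (
    raw.dropDuplicates(["order_id"])                       # idempotent re-runs
       .withColumn("order_date", F.to_date("order_ts"))    # derive partition key
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .filter(F.col("amount") > 0)                        # basic quality gate
)

(curated.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("abfss://curated@examplelake.dfs.core.windows.net/orders/"))
```

The same transformation logic can be scheduled from ADF by invoking it as a Databricks notebook or job activity.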

Posted 2 months ago

Apply

5.0 - 7.0 years

5 - 7 Lacs

Thiruvananthapuram

On-site

5 - 7 Years | 1 Opening | Kochi, Trivandrum

Role Description
Job Title: Business Intelligence Developer
Experience: Minimum 4 years of relevant experience
Notice Period: Maximum 30 days
Location: Trivandrum, Cochin

Job Description
We are seeking a skilled and experienced Business Intelligence (BI) Developer with over 6 years of total experience, including a minimum of 4 years in relevant BI technologies. The ideal candidate should be capable of independently handling BI development, reporting, and data modeling tasks, and be comfortable interacting with C-suite executives and stakeholders.

Key Responsibilities
- Develop and maintain BI applications using SQL Server, Salesforce (SF), Google Cloud Platform (GCP), PostgreSQL, Power BI, and Tableau.
- Apply strong understanding and hands-on experience with data modeling concepts: dimensional and relational modeling, star and snowflake schemas, fact and dimension tables.
- Translate complex business requirements into functional and non-functional technical specifications.
- Collaborate effectively with business stakeholders and leadership.
- Write optimized T-SQL; create user-defined functions, triggers, views and temporary tables; implement constraints and indexes using DDL/DML.
- Design and develop SSAS OLAP cubes and write complex DAX expressions; proficiency with external tools like Tabular Editor and DAX Studio.
- Customize and optimize stored procedures and complex SQL queries for data extraction and transformation.
- Implement Incremental Refresh, Row-Level Security (RLS), parameterization, dataflows, and data gateways in Power BI.
- Develop and maintain SSRS and Power BI reports.
- Optimize Power BI performance using Mixed Mode and DirectQuery configurations.

Key Skills
Power BI, Power BI tools (DAX, Dataflows, etc.), Data Analysis, Microsoft Fabric

About UST
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world's best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients' organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact, touching billions of lives in the process.

Posted 2 months ago

Apply

0 years

4 - 8 Lacs

Gurgaon

On-site

Ready to build the future with AI? At Genpact, we don't just keep up with technology, we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Principal Consultant - Azure Data Engineer! We are looking for an experienced Azure data engineer to design, develop, and maintain scalable data solutions on Microsoft Azure. The ideal candidate will have a strong background in data engineering, cloud technologies, and analytics, with expertise in building robust data pipelines and optimizing data workflows. This role requires collaboration with cross-functional teams to deliver high-quality solutions that meet business needs.

Responsibilities
- Design and implement scalable data pipelines and ETL processes using Azure Data Factory, Databricks, and other Azure services (an incremental-load sketch follows this posting).
- Develop and optimize data storage solutions, including Azure Data Lake, Azure SQL Database and Snowflake, using SQL and Python.
- Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
- Ensure data quality, security, and compliance by implementing best practices for governance and monitoring.
- Perform performance tuning and optimization of data workflows and queries.
- Create documentation for data architecture, processes, and workflows to ensure knowledge sharing across teams.
- Work closely with analytics teams to enable advanced reporting and machine learning capabilities.

Minimum Qualifications
- Bachelor's degree in computer science, information technology, or a related field.
- Strong experience working on Azure data services, data engineering and transformation projects.
- Strong understanding of data warehousing, ETL processes, OLAP concepts and data modelling concepts.
- Strong communication and collaboration abilities.
- Experience with Agile ways of working.

Mandatory Skills
- Proficiency in Azure services such as Azure Data Factory, Azure Data Lake and Databricks.
- Strong programming skills in Python and SQL for data processing and transformation.
- Experience with ETL/ELT processes and designing data pipelines.
- Experience with Snowflake.
- Knowledge of cloud architecture principles and best practices for scalability and security.
- Familiarity with CI/CD pipelines and DevOps practices for deploying data solutions.

Preferred Qualifications / Skills
- Experience in banking projects using Azure.
- Certification in Microsoft Azure (such as Azure Data Engineer Associate).
- Understanding of Power BI or other visualization tools for reporting purposes.
- Strong problem-solving skills and ability to work in a fast-paced environment.

Why join Genpact?
- Lead AI-first transformation: build and scale AI solutions that redefine industries.
- Make an impact: drive change for global enterprises and solve business challenges that matter.
- Accelerate your career: gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills.
- Grow with the best: learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace.
- Committed to ethical AI: work in an environment where governance, transparency, and security are at the core of everything we build.
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Principal Consultant | Primary Location: India-Gurugram | Schedule: Full-time | Education Level: Bachelor's / Graduation / Equivalent | Job Posting: Jun 4, 2025, 1:32:35 AM | Unposting Date: Jul 4, 2025, 1:29:00 PM | Master Skills List: Digital | Job Category: Full Time
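A hedged sketch of the watermark pattern that commonly underlies incremental ADF/Databricks pipelines like those described above: copy only rows modified since the last successful run, then advance the watermark. The table names, the DB-API connection object, and the SQLite-style placeholders are assumptions for illustration, not this role's actual design:

```python
# Watermark-based incremental load sketch (assumed tables and connection).
from datetime import datetime

def incremental_load(conn, last_watermark: datetime) -> datetime:
    """Copy rows modified after last_watermark; return the new watermark."""
    rows = conn.execute(
        "SELECT id, payload, modified_at FROM src_orders WHERE modified_at > ?",
        (last_watermark,),
    ).fetchall()
    for row_id, payload, modified_at in rows:
        conn.execute(
            "INSERT OR REPLACE INTO stg_orders (id, payload, modified_at) "
            "VALUES (?, ?, ?)",
            (row_id, payload, modified_at),
        )
    conn.commit()
    # Advance the watermark only after the batch commits successfully,
    # so a failed run is simply retried from the old watermark.
    return max((r[2] for r in rows), default=last_watermark)
```

ADF expresses the same idea declaratively: a lookup activity reads the stored watermark, the copy activity filters the source on it, and a final activity writes the new high-water mark.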

Posted 2 months ago

Apply

0 years

0 Lacs

Chennai

On-site

Strong working knowledge of modern programming languages, ETL/data integration tools (preferably SnapLogic) and understanding of cloud concepts: SSL/TLS, SQL, REST, JDBC, JavaScript, JSON.
- Strong hands-on experience in SnapLogic design/development.
- Good working experience using various snaps for JDBC, SAP, files, REST, SOAP, etc.
- Ability to build complex mappings with JSON path expressions, flat files and cloud sources, plus Python scripting, is good to have (a small mapping sketch follows this posting).
- Experience in Groundplex and Cloudplex integrations is good to have.
- Should be able to deliver the project by leading a team of 6-8 members.
- Should have experience in integration projects with heterogeneous landscapes.
- Experience in one or more RDBMS (Oracle, DB2, SQL Server, PostgreSQL and Redshift).
- Real-time experience working with OLAP and OLTP database models (dimensional models).

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth, one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
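A small illustration of the JSON-path-style mapping this posting mentions, written as plain Python of the sort a script snap might run; the field names are invented:

```python
# JSON reshaping sketch: map a nested source document to a flat target record,
# roughly equivalent to mapping $Order.Header.Customer.Name -> customer, etc.
source = {
    "Order": {"Id": "4711", "Header": {"Customer": {"Name": "ACME"}}},
    "Lines": [{"Sku": "A-1", "Qty": 2}, {"Sku": "B-9", "Qty": 1}],
}

target = {
    "order_id": source["Order"]["Id"],
    "customer": source["Order"]["Header"]["Customer"]["Name"],
    "lines": [
        {"sku": line["Sku"], "quantity": line["Qty"]}
        for line in source["Lines"]
    ],
}
print(target)
```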

Posted 2 months ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Key Responsibilities
ETL & BI Testing:
- Manage the testing of ETL processes, data pipelines, and BI reports to ensure accuracy and reliability.
- Develop and execute test strategies, test plans, and test cases for data validation.
- Perform data reconciliation, transformation validation, and SQL-based testing to ensure data correctness (a reconciliation sketch follows this posting).
- Validate reports and dashboards built using BI tools (Power BI, Tableau).
- Automate ETL testing where applicable using Python, Selenium, or other automation tools.
- Identify and log defects, track issues, and ensure timely resolution.
- Collaborate with business stakeholders to understand data requirements and reporting needs.
- Assist in documenting functional and non-functional requirements for data transformation and reporting.
- Support data mapping, data profiling, and understanding of business rules applied to datasets.
- Participate in requirement-gathering sessions and provide inputs on data validation needs.
- Mentor a team of ETL/BI testers and provide guidance on testing best practices.
- Coordinate with developers, BAs, and business users to ensure end-to-end data validation.
- Define QA processes, best practices, and automation strategies to improve testing efficiency.

Required Skills & Experience
- 6-8 years of experience in ETL, data warehouse, and BI testing.
- Strong experience with SQL, data validation techniques, and database testing.
- Hands-on experience with ETL tools (Informatica, Talend, SSIS, or similar).
- Proficiency in BI tools like Power BI and Tableau for report validation.
- Good knowledge of data modeling, star schemas, and OLAP concepts.
- Experience in data reconciliation, transformation logic validation, and data pipeline testing.
- Experience in the insurance domain is an added advantage.
- Automation skills for data and report testing (Python, Selenium, or ETL testing frameworks) are a plus.
- Experience in understanding and documenting business and data requirements; ability to work with business users to gather and analyze reporting needs.
- Strong analytical and problem-solving skills.
- Excellent communication and stakeholder management abilities.
- Experience with Agile/Scrum methodologies and working in a cross-functional team.

Preferred Qualifications
- Experience with cloud-based data platforms (AWS, Azure, GCP) is a plus.
- ISTQB or equivalent certification in software testing is preferred.
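A minimal sketch of the SQL-based reconciliation testing described above: compare row counts and a column aggregate between source and target over any DB-API connections. The table and measure names are placeholders:

```python
# Reconciliation-testing sketch: source vs. target counts and sums.
def reconcile(src_conn, tgt_conn, table: str, measure: str) -> list[str]:
    """Return human-readable mismatches; an empty list means the table passes."""
    issues = []

    src_count = src_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    tgt_count = tgt_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    if src_count != tgt_count:
        issues.append(f"{table}: row count {src_count} != {tgt_count}")

    # Aggregate checksum on a numeric measure catches silent value corruption
    # that a plain row count would miss.
    src_sum = src_conn.execute(f"SELECT SUM({measure}) FROM {table}").fetchone()[0]
    tgt_sum = tgt_conn.execute(f"SELECT SUM({measure}) FROM {table}").fetchone()[0]
    if src_sum != tgt_sum:
        issues.append(f"{table}: SUM({measure}) {src_sum} != {tgt_sum}")

    return issues
```

In practice this would run per table from a test harness (pytest or similar), with mismatches logged as defects.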

Posted 2 months ago

Apply

4.0 years

0 Lacs

Kochi, Kerala, India

On-site

Role Description
Job Title: Business Intelligence Developer
Experience: Minimum 4 years of relevant experience
Notice Period: Maximum 30 days
Location: Trivandrum, Cochin

Job Description
We are seeking a skilled and experienced Business Intelligence (BI) Developer with over 6 years of total experience, including a minimum of 4 years in relevant BI technologies. The ideal candidate should be capable of independently handling BI development, reporting, and data modeling tasks, and be comfortable interacting with C-suite executives and stakeholders.

Key Responsibilities
- Develop and maintain BI applications using SQL Server, Salesforce (SF), Google Cloud Platform (GCP), PostgreSQL, Power BI, and Tableau.
- Apply strong understanding and hands-on experience with data modeling concepts: dimensional and relational modeling, star and snowflake schemas, fact and dimension tables.
- Translate complex business requirements into functional and non-functional technical specifications.
- Collaborate effectively with business stakeholders and leadership.
- Write optimized T-SQL; create user-defined functions, triggers, views and temporary tables; implement constraints and indexes using DDL/DML.
- Design and develop SSAS OLAP cubes and write complex DAX expressions; proficiency with external tools like Tabular Editor and DAX Studio.
- Customize and optimize stored procedures and complex SQL queries for data extraction and transformation.
- Implement Incremental Refresh, Row-Level Security (RLS), parameterization, dataflows, and data gateways in Power BI.
- Develop and maintain SSRS and Power BI reports.
- Optimize Power BI performance using Mixed Mode and DirectQuery configurations.

Key Skills
Power BI, Power BI tools (DAX, Dataflows, etc.), Data Analysis, Microsoft Fabric

Posted 2 months ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate,

We are hiring a Data Warehouse Architect to design scalable, high-performance data warehouse solutions for analytics and reporting. Perfect for engineers experienced with large-scale data systems.

Key Responsibilities:
- Design and maintain enterprise data warehouse architecture.
- Optimize ETL/ELT pipelines and data modeling (star/snowflake schemas).
- Ensure data quality, security, and performance.
- Work with BI teams to support analytics and reporting needs.

Required Skills & Qualifications:
- Proficiency with SQL and data warehousing tools (Snowflake, Redshift, BigQuery, etc.).
- Experience with ETL frameworks (Informatica, Apache NiFi, dbt, etc.).
- Strong understanding of dimensional modeling and OLAP.
- Bonus: knowledge of cloud data platforms and orchestration tools (Airflow).

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies

Posted 2 months ago

Apply

8.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

This role is for one of Weekday's clients.
Min Experience: 8 years | Location: Bangalore, Mumbai | Job Type: full-time

We are seeking a highly experienced and motivated Lead Data Engineer to join our data engineering team. This role is perfect for someone with 8-10 years of hands-on experience in designing and building scalable data infrastructure, data pipelines, and high-performance data platforms. You will lead a team of engineers, set data engineering standards, and work cross-functionally with data scientists, analysts, and software engineers to enable a data-driven culture within the organization.

Key Responsibilities:
- Technical leadership: lead the design and development of robust, scalable, and high-performance data architectures, including batch and real-time data pipelines using modern technologies.
- Data pipeline development: architect, implement, and maintain complex ETL/ELT workflows using tools like Apache Airflow, Spark, Kafka, or similar (a minimal Airflow sketch follows this posting).
- Data warehouse management: design and maintain cloud-based data warehouses and data lakes (e.g., Snowflake, Redshift, BigQuery, Delta Lake), ensuring optimized storage and query performance.
- Data quality and governance: implement data validation, monitoring, and governance processes to ensure data accuracy, completeness, and security across all platforms.
- Collaboration: work closely with stakeholders, including business analysts, data scientists, and application developers, to understand data needs and deliver effective solutions.
- Mentorship and team management: guide and mentor junior and mid-level data engineers; foster best practices in code, architecture, and agile delivery.
- Automation and CI/CD: develop and manage data pipeline deployment processes using DevOps and CI/CD principles.

Required Skills & Qualifications:
- 8-10 years of proven experience in data engineering or a related field.
- Strong programming skills in Python, Scala, or Java.
- Expertise in building scalable and fault-tolerant ETL/ELT processes using frameworks such as Apache Spark, Kafka, Airflow, or similar.
- Hands-on experience with cloud platforms (AWS, GCP, or Azure) and tools like S3, Redshift, Snowflake, BigQuery, Glue, EMR, or Databricks.
- In-depth understanding of relational and NoSQL databases (PostgreSQL, MongoDB, Cassandra, etc.).
- Strong SQL skills with the ability to write complex and optimized queries.
- Familiarity with data modeling, data warehousing concepts, and OLAP/OLTP systems.
- Experience deploying data services using containerization (Docker, Kubernetes) and CI/CD tools like Jenkins, GitHub Actions, or similar.
- Excellent communication skills with a collaborative and proactive attitude.

Preferred Qualifications:
- Experience working in fast-paced, agile environments or startups.
- Exposure to machine learning pipelines, MLOps, or real-time analytics.
- Familiarity with data governance frameworks and data privacy regulations (GDPR, CCPA).
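A minimal Apache Airflow sketch of the ETL/ELT orchestration named above; the DAG id, schedule, and task bodies are placeholders, and the `schedule` argument assumes Airflow 2.4+ (older versions use `schedule_interval`):

```python
# Airflow DAG sketch: three-step extract -> transform -> load pipeline.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(): ...    # placeholder callables; real tasks would pull, reshape,
def transform(): ...  # and persist data
def load(): ...

with DAG(
    dag_id="orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Dependencies define the execution order the scheduler enforces.
    t_extract >> t_transform >> t_load
```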

Posted 2 months ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka

On-site

- 3+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines

Amazon's Consumer Payments organization is seeking a highly quantitative, experienced Data Engineer to drive growth through analytics, automation of data pipelines, and enhancement of self-serve experiences. You will succeed in this role if you are an organized self-starter who can learn new technologies quickly and excel in a fast-paced environment. In this position, you will be a key contributor and sparring partner, developing analytics and insights that global executive management teams and business leaders will use to define global strategies and deep dive businesses. You will be part of the team focused on acquiring new merchants from around the world for payments around the world. The position is based in India but will interact with global leaders and teams in Europe, Japan, the US, and other regions. You should be highly analytical, resourceful, customer focused, team oriented, and able to work independently under time constraints to meet deadlines. You will be comfortable thinking big and diving deep. A proven track record of taking on end-to-end ownership and successfully delivering results in a fast-paced, dynamic business environment is strongly preferred.

Responsibilities include but are not limited to:
- Design, develop, implement, test, and operate large-scale, high-volume, high-performance data structures for analytics and reporting.
- Implement data structures using best practices in data modeling, ETL/ELT processes, SQL, AWS Redshift, and OLAP technologies; model data and metadata for ad hoc and pre-built reporting (a hedged S3-to-Redshift load sketch follows this posting).
- Work with product tech teams to build robust and scalable data integration (ETL) pipelines using SQL, Python and Spark.
- Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Interface with business customers, gathering requirements and delivering complete reporting solutions.
- Collaborate with Analysts, Business Intelligence Engineers and Product Managers to implement algorithms that exploit rich data sets for statistical analysis and machine learning.
- Participate in strategic and tactical planning discussions, including annual budget processes.
- Communicate effectively with product, business and tech teams and other data teams.

Preferred:
- Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions.
- Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases).

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
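A hedged sketch of an S3-to-Redshift bulk load of the kind this role involves. COPY is Redshift's documented bulk-load command, but the bucket, table, and IAM role ARN below are invented placeholders:

```python
# S3 -> Redshift load sketch using any DB-API connection to the cluster
# (e.g. a psycopg2 or redshift_connector connection).
COPY_SQL = """
    COPY analytics.fact_payments
    FROM 's3://example-bucket/payments/2024/01/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-load'
    FORMAT AS PARQUET
"""

def load_partition(conn) -> None:
    """Run the bulk load and commit; COPY parallelises the read across slices."""
    cur = conn.cursor()
    cur.execute(COPY_SQL)
    conn.commit()
```

Loading through COPY rather than row-by-row INSERTs is the standard pattern here, since it lets the cluster ingest the S3 prefix in parallel.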

Posted 2 months ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Coupa makes margins multiply through its community-generated AI and industry-leading total spend management platform for businesses large and small. Coupa AI is informed by trillions of dollars of direct and indirect spend data across a global network of 10M+ buyers and suppliers. We empower you with the ability to predict, prescribe, and automate smarter, more profitable business decisions to improve operating margins.

Why join Coupa?
🔹 Pioneering Technology: At Coupa, we're at the forefront of innovation, leveraging the latest technology to empower our customers with greater efficiency and visibility in their spend.
🔹 Collaborative Culture: We value collaboration and teamwork, and our culture is driven by transparency, openness, and a shared commitment to excellence.
🔹 Global Impact: Join a company where your work has a global, measurable impact on our clients, the business, and each other.
Learn more on the Life at Coupa blog and hear from our employees about their experiences working at Coupa.

The Impact of a Technical Architect at Coupa:
Coupa's Professional Services teams collaborate with our customers, partners and internal Product Management to implement the most valuable solutions for our customers. We are now looking to add a knowledgeable Technical Architect with experience working with integration technologies in the procurement, supply chain and/or AP automation space to our team in EMEA, to manage the successful delivery of integration projects at Coupa. This is an outstanding opportunity to join a high-growth organisation in a key role where you can make an impact and fuel your career development.

What You'll Do:
- Lead customer expectations in collaboration with Coupa and customer project managers to ensure timely delivery and adherence to quality standards.
- Engage with customer architecture teams and senior leadership to address integration requirements in both individual and team settings, onsite and remotely.
- Design integration strategies for data extraction and consumption across Coupa in multi-ERP environments, and build infrastructure to support data loading from various sources.
- Configure, develop, and troubleshoot RESTful APIs and flat-file integrations, while identifying and resolving integration-related issues efficiently (a small sketch follows this posting).
- Mentor and guide integration engineers from both Coupa and partner teams to ensure successful solution delivery.
- Collaborate with Solution Architecture teams to enhance best practices and standardized implementation methodologies.
- Contribute to the learning and development of professional services and delivery teams by supporting the learning experience function and helping build consultant capabilities.

What you will bring to Coupa:
- 2+ years of professional experience with hands-on expertise architecting large-scale ERP integrations (SAP, PeopleSoft, Oracle E-Business Suite, NetSuite), including finance and procurement domains like Procure to Order, Procure to Pay, Expenses, and Accounts Payable; exposure to supply chain planning systems is a plus.
- Strong knowledge of ERP domains (SAP, PeopleSoft, Oracle), web technologies, Single Sign-On, and cloud platforms like AWS; proficient with sFTP, RESTful APIs, and SOAP APIs.
- Skilled in data architecture and processing, including both OLAP and OLTP, and experienced in Linux server administration within virtualized environments.
- Middleware expertise with platforms such as IBM, TIBCO, SAP, Oracle, Boomi, or Talend, and integration standards like EDI and cXML.
- Proficient in programming languages (Ruby, Java, .NET), scripting (Python, PowerShell), big data tools (Hadoop, Spark, Kafka), and databases (SQL Server, PostgreSQL, MongoDB, Cassandra, etc.).

Coupa complies with relevant laws and regulations regarding equal opportunity and offers a welcoming and inclusive work environment. Decisions related to hiring, compensation, training, or evaluating performance are made fairly, and we provide equal employment opportunities to all qualified candidates and employees. Please be advised that inquiries or resumes from recruiters will not be accepted. By submitting your application, you acknowledge that you have read Coupa's Privacy Policy and understand that Coupa receives/collects your application, including your personal data, for the purposes of managing Coupa's ongoing recruitment and placement activities, including for employment purposes in the event of a successful application and for notification of future job opportunities if you did not succeed the first time. You will find more details about how your application is processed, the purposes of processing, and how long we retain your application in our Privacy Policy.
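A small sketch of a flat-file-to-REST integration like the one described above, using Python's requests library; the endpoint, token, and CSV layout are hypothetical, not Coupa's actual API:

```python
# Flat-file -> REST integration sketch: POST each CSV row as a JSON record.
import csv

import requests

def push_invoices(csv_path: str, api_url: str, token: str) -> None:
    """Read a delimited file and post each row to the (assumed) REST endpoint."""
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    with open(csv_path, newline="") as fh:
        for row in csv.DictReader(fh):
            resp = requests.post(api_url, json=row, headers=headers, timeout=30)
            resp.raise_for_status()   # surface integration failures early
```

A production integration would add batching, retry with backoff, and dead-letter handling for rejected records; the sketch shows only the basic mapping from file rows to API calls.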

Posted 2 months ago

Apply