8.0 - 12.0 years
0 Lacs
chennai, tamil nadu
On-site
You should have a minimum of 8 years of experience as a Power BI Developer, with 8 to 12 years of total experience. Your role will involve hands-on experience in handling teams and clients. You should possess expert knowledge of advanced calculations in MS Power BI Desktop, including DAX functions such as Aggregate, Date, Logical, String, and Table functions. Prior experience in connecting Power BI to both on-premise and cloud computing platforms is required. A deep understanding of, and the ability to utilize and explain, relational database design, multidimensional database design, OLTP, OLAP, KPIs, scorecards, and dashboards is essential for this role. You should have a very good understanding of data modeling techniques for analytical data, including facts, dimensions, and measures. Experience in data warehouse design, specifically dimensional modeling, and data mining will be beneficial for this position. Additionally, hands-on experience in SSIS, SSRS, and SSAS will be considered a plus.
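As a rough illustration of the kind of aggregate and date calculation that DAX measures express, here is a minimal Python sketch (standard library only; the sample table and its columns are hypothetical, and this is an analogy, not the DAX language itself) computing a year-to-date total over sample sales rows:

```python
from datetime import date

# Hypothetical fact rows: (order_date, amount) -- stand-ins for a Power BI table.
sales = [
    (date(2024, 1, 15), 100.0),
    (date(2024, 3, 2), 250.0),
    (date(2024, 7, 9), 75.0),
    (date(2023, 11, 30), 500.0),
]

def total_ytd(rows, as_of):
    """Sum amounts from Jan 1 of as_of's year up to and including as_of,
    roughly what a DAX year-to-date measure computes."""
    start = date(as_of.year, 1, 1)
    return sum(amount for d, amount in rows if start <= d <= as_of)

print(total_ytd(sales, date(2024, 6, 30)))  # 350.0 (Jan + Mar rows only)
```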
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
You will join our data engineering and business intelligence team as an SSIS (SQL Server Integration Services) and SSAS (SQL Server Analysis Services) Developer. Your primary responsibilities will include designing, developing, deploying, and maintaining SSIS packages for ETL processes and managing SSAS cubes for advanced analytics and reporting.

Responsibilities:
- Collaborate with business analysts, data architects, and stakeholders to understand data requirements.
- Optimize existing ETL processes for improved performance, scalability, and reliability.
- Create and maintain technical documentation; monitor ETL workflows and troubleshoot issues.
- Implement data quality checks and perform data validation and unit testing.
- Integrate SSIS/SSAS with reporting tools such as Power BI and Excel.
- Participate in code reviews, sprint planning, and agile development.

Requirements:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- At least 3 years of hands-on experience with SSIS and SSAS.
- Strong proficiency in SQL Server and T-SQL, and in building both Multidimensional and Tabular SSAS models.
- Deep understanding of data warehousing concepts, star/snowflake schemas, ETL best practices, and performance tuning in SSIS and SSAS.
- Proficiency in data visualization tools such as Power BI or Excel (PivotTables) is preferred.
- Experience with Azure Data Factory, Synapse, or other cloud-based data services; exposure to DevOps CI/CD pipelines for SSIS/SSAS deployments; familiarity with MDX and DAX query languages; and certification in the Microsoft SQL Server BI Stack will be advantageous.
- Strong analytical and problem-solving skills, effective communication and collaboration, and the ability to work independently while managing multiple tasks.
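The data quality checks and batch validation mentioned above can be sketched in a few lines of Python; the row structure and rules here are hypothetical stand-ins for what an SSIS package would enforce before loading a batch:

```python
# Minimal sketch of pre-load data-quality checks for an ETL batch.
# The row shape and rules are hypothetical, not a specific SSIS package.
rows = [
    {"customer_id": 1, "amount": 120.5},
    {"customer_id": 2, "amount": 80.0},
    {"customer_id": 3, "amount": 99.9},
]

def validate_batch(batch):
    """Return a list of human-readable problems; an empty list means the batch passes."""
    problems = []
    if not batch:
        problems.append("batch is empty")
    seen = set()
    for i, row in enumerate(batch):
        if row.get("customer_id") is None:
            problems.append(f"row {i}: missing customer_id")
        elif row["customer_id"] in seen:
            problems.append(f"row {i}: duplicate customer_id {row['customer_id']}")
        else:
            seen.add(row["customer_id"])
        if row.get("amount") is None or row["amount"] < 0:
            problems.append(f"row {i}: invalid amount")
    return problems

print(validate_batch(rows))  # [] -> safe to load
```

In a real pipeline the non-empty result would route the batch to a reject table or fail the job rather than just printing.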
Posted 2 weeks ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About Us
VE3 is at the forefront of delivering cloud-native data solutions to premier clients across finance, retail and healthcare. As a rapidly growing UK-based consultancy, we pride ourselves on fostering a collaborative, inclusive environment where every voice is heard and every idea can become tomorrow's breakthrough.

Role: Database Designer / Senior Data Engineer

What You'll Do
Architect & Design:
- Lead the design of modern, scalable data platforms on AWS and/or Azure, using best practices for security, cost-optimisation and performance.
- Develop detailed data models (conceptual, logical, physical) and document data dictionaries and lineage.
Build & Optimize:
- Implement robust ETL/ELT pipelines using Python, SQL or Scala (as appropriate), leveraging services such as AWS Glue, Azure Data Factory, and open-source frameworks (Spark, Airflow).
- Tune data stores (RDS, SQL Data Warehouse, NoSQL stores like Redis) for throughput, concurrency and cost.
- Establish real-time data streaming solutions via AWS Kinesis, Azure Event Hubs or Kafka.
Collaborate & Deliver:
- Work closely with data analysts, BI teams and stakeholders to translate business requirements into data solutions and dashboards.
- Partner with DevOps/Cloud Ops to automate CI/CD for data code and infrastructure (Terraform, CloudFormation).
Governance & Quality:
- Define and enforce data governance, security and compliance standards (GDPR, ISO 27001).
- Implement monitoring, alerting and data quality frameworks (Great Expectations, AWS CloudWatch).
Mentor & Innovate:
- Act as a technical mentor for junior engineers; run brown-bag sessions on new cloud services or data-engineering patterns.
- Proactively research emerging big-data and streaming technologies to keep our toolset cutting-edge.

Who You Are
- Academic Background: Bachelor's (or higher) in Computer Science, Engineering, IT or similar.
- Experience: ≥3 years in a hands-on Database Designer / Data Engineer role, ideally within a cloud environment.
Technical Skills:
- Languages: Expert in SQL; strong Python or Scala proficiency.
- Cloud Services: At least one of AWS (Glue, S3, Kinesis, RDS) or Azure (Data Factory, Data Lake Storage, SQL Database).
- Data Modelling: Solid understanding of OLTP vs OLAP, star/snowflake schemas, and normalization/denormalization trade-offs.
- Pipeline Tools: Familiarity with Apache Spark, Kafka, Airflow or equivalent.

Soft Skills:
- Excellent communicator, able to present complex technical designs in clear, non-technical terms.
- Strong analytical mindset; thrives on solving performance bottlenecks and scaling challenges.
- Team player with a collaborative attitude in agile/scrum settings.

Nice to Have:
- Certifications: AWS Certified Data Analytics – Specialty, Azure Data Engineer Associate/Expert.
- Exposure to data-science workflows (Jupyter, ML pipelines).
- Experience with containerized workloads (Docker, Kubernetes) for data processing.
- Familiarity with DataOps practices and tools (dbt, Great Expectations, Terraform).

Our Commitment to Diversity
We're an equal-opportunity employer committed to inclusive hiring. All qualified applicants, regardless of ethnicity, gender identity, sexual orientation, neurodiversity, disability status or veteran status, are encouraged to apply.
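To make the star-schema idea concrete, here is a minimal, self-contained sketch using Python's built-in sqlite3 module (the table and column names are hypothetical): a fact table holding the measure is joined to a dimension table so the measure can be aggregated by a dimension attribute, which is the typical OLAP query shape.

```python
import sqlite3

# Minimal star-schema sketch: one fact table and one dimension table.
# All names are hypothetical; real warehouses have several dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales  (product_id INTEGER REFERENCES dim_product, amount REAL);
INSERT INTO dim_product VALUES (1, 'Books'), (2, 'Games');
INSERT INTO fact_sales  VALUES (1, 10.0), (1, 15.0), (2, 40.0);
""")

# Typical OLAP-style query: aggregate the fact measure by a dimension attribute.
rows = conn.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product d USING (product_id)
    GROUP BY d.category
    ORDER BY d.category
""").fetchall()
print(rows)  # [('Books', 25.0), ('Games', 40.0)]
```

A snowflake schema would further normalize `dim_product` (e.g. splitting category into its own table), trading simpler updates for an extra join at query time.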
Posted 2 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Calling all innovators – find your future at Fiserv.

We're Fiserv, a global leader in fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we're involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

Job Title: Expertise - MSBI Dev (ETL - SSIS, SQL) with Power BI Dev || Pune / Noida / Chennai / Bangalore

We are seeking a highly skilled and experienced Specialist Power BI Developer to join our team, specifically focused on our client inquiry management project. As a Specialist Power BI Developer, you will be responsible for formulating and delivering automated reports and dashboards using Power BI and other reporting tools, with a specific focus on inquiry reporting metrics such as MTTR, average aging, and platform adoption. You will work closely with business stakeholders to understand their requirements related to inquiry management and translate them into functional specifications for reporting applications. You will have expertise in SQL queries, Power BI, and SQL Server Integration Services (SSIS), and a strong understanding of database concepts and data modeling.

Responsibilities:
- Formulate automated reports and dashboards using Power BI and other reporting tools, with a focus on inquiry reporting metrics.
- Understand the specific business requirements related to inquiry management and set functional specifications for reporting applications.
- Utilize expertise in SQL queries, Power BI, and SSIS to gather and analyze data related to inquiries for reporting purposes.
- Develop technical specifications from business needs and establish deadlines for work completion.
- Design data models that transform raw inquiry data into insightful knowledge by understanding business requirements in the context of inquiry reporting metrics.
- Create dynamic, eye-catching dashboards and reports using Power BI, highlighting key metrics and trends related to inquiries.
- Implement row-level security on data and understand Power BI's application security layer models, ensuring the privacy and confidentiality of inquiry data.
- Collaborate with cross-functional teams to integrate, alter, and connect inquiry-related data sources for business intelligence purposes.
- Make necessary tactical and technological adjustments to enhance the current inquiry management reporting systems.
- Troubleshoot and resolve data quality and reporting issues specifically focused on inquiries.
- Communicate effectively with internal and client teams to explain requirements and deliver solutions related to inquiry reporting metrics.
- Stay up to date with industry trends and advancements in Power BI and business intelligence for effective inquiry reporting.

Requirements:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Proven experience as a Power BI Developer or in a similar role, with a specific focus on reporting related to inquiries or customer service metrics.
- Expertise in SQL queries, Power BI, and SQL Server Integration Services (SSIS).
- Excellent communication skills to effectively articulate requirements and collaborate with internal and client teams.
- Strong analytical thinking skills for converting inquiry data into illuminating reports and insights.
- Knowledge of data warehousing, data gateways, and data preparation projects.
- Familiarity with the Microsoft SQL Server BI Stack, including SSIS, T-SQL, Power Query, MDX, Power BI, and DAX.
- Detailed knowledge and understanding of database management systems, OLAP, and the ETL framework.
- Proficiency in Microsoft Excel and other data analysis tools.
- Ability to gather and analyze inquiry-specific business requirements and translate them into technical specifications.
- Strong attention to detail and the ability to QA and validate data for accuracy.
- Ability to manage multiple projects and deadlines simultaneously.
- Knowledge of Agile development methodologies is a plus.
- Ability to learn and adapt to new technologies and tools quickly.

Thank you for considering employment with Fiserv. Please:
- Apply using your legal name
- Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).

Our Commitment to Diversity and Inclusion
Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

Note to Agencies
Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning About Fake Job Posts
Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
Posted 2 weeks ago
10.0 years
0 Lacs
Kochi, Kerala, India
On-site
Job Overview:
Join a global organization with 82,000+ employees around the world as a Lead Data Warehouse Developer based in IQVIA Kochi/Bangalore. You will be part of IQVIA's world-class technology team and will be involved in the design, development and enhancement of software programs, cloud applications and proprietary products.

Technical Skills:
- Very good understanding of data warehousing design methodology, data warehouse concepts, and database design/architecture covering both OLAP and OLTP data, and of the complete project lifecycle from design through development and implementation.
- Extensive experience in the analysis, design, development, implementation, deployment and maintenance of business intelligence applications.
- Expertise in identifying business and data requirements and converting them into conceptual and logical data models.
- Excellent data-analysis skills and the ability to translate specifications into a working report.
- Experienced in training and supporting end-user reporting needs.
- Exposure to dimensional data modeling: star schema and snowflake modeling.
- Excellent communication, decision-making, problem-solving, analytical, and interpersonal skills, with a result-oriented dedication towards goals.
- Capable of working as a team member or individually with minimum supervision.
- Flexible and versatile, able to adapt to any new environment, with a strong desire to keep pace with the latest technologies.

Key Responsibilities:
- Minimum 10+ years of experience in the IT industry; should have worked on a large-scale client implementation project.
- 4-6 years managing the daily activities of the team responsible for the design, implementation, maintenance, and support of data warehouse systems and related data marts.
- Oversee data design and the creation of database architecture and data repositories.
- Drive change to implement an efficient and effective data-warehousing strategy.
- Ensure that projects are accurately estimated and delivered to schedule.
- Work closely with the business and developers on issues related to design and requirements.
- Actively contribute to the process of continual improvement with regard to self, team and systems.
- Ensure that development standards, policies and procedures are adhered to.
- Work with analysts from the business, your development team, and other areas to deliver data-centric projects.
- Be responsible for the maintenance and improvement of existing data batch processes and the implementation of new ETL and BI systems.
- Coordinate internal resources and third parties/vendors for the flawless execution of projects.
- Ensure that all projects are delivered on time, within scope and within budget.
- Develop project scopes and objectives, involving all relevant stakeholders and ensuring technical feasibility.
- Hands-on expertise in ETL, DWH and BI development.

Competencies:
- Strong adaptability and problem-solving skills in an agile work setting.
- Extensive hands-on experience with key Azure services and SQL scripting.
- Ability to develop and implement effectively.
- Proven analytical and troubleshooting skills.
- Excellent written and oral communication.

Qualifications:
- Graduate: B.Tech/BE/MCA
- Certifications (good to have): DWH, SQL, BI

IQVIA is a leading global provider of clinical research services, commercial insights and healthcare intelligence to the life sciences and healthcare industries. We create intelligent connections to accelerate the development and commercialization of innovative medical treatments to help improve patient outcomes and population health worldwide. Learn more at https://jobs.iqvia.com
Posted 2 weeks ago
9.0 - 12.0 years
35 - 40 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
The impact that you will be making
The role would require you to develop sophisticated products that add value to the client and result in new projects and revenue streams.

What this role entails:
- Analyze database design and architecture requirements for distributed applications.
- Design the databases for microservices with well-defined bounded contexts and interactions.
- Develop, construct, test and maintain existing and new data-driven architectures.
- Align architecture with business requirements and provide solutions that best solve the business problems.
- Data acquisition and clean-up from multiple sources across the organization.
- Design custom tools to collate the data.
- Identify ways to improve data reliability, efficiency, and quality.
- Develop automated tasks for data operations.
- Deliver updates to stakeholders based on analytics.
- Set up practices for data reporting and continuous monitoring using Prometheus, ELK or similar tools.
- Be responsible for designing, implementing, and managing our database systems with a strong focus on optimizing complex analytical data queries.
- Collaborate with data engineers, analysts, and developers to ensure seamless data flow and accessibility.
- Should have a deep understanding of OLAP architecture, cube design, and multi-dimensional data modeling.

What lands you in this role:
- Seven or more years as a hands-on technologist with a good mix of application development and database development.
- Expert-level SQL and PostgreSQL, including stored procedures, functions, triggers, and views.
- Strong understanding of database design and modelling.
- Ability to efficiently write database code without compromising data quality, privacy or security.
- Knowledge of database design principles, query optimization, index management, integrity checks, statistics, partitioning and isolation levels.
- Good understanding of PostgreSQL replication, high availability and clustering across availability zones and regions.
- Good understanding of database backup and restore to align with data laws.
- Participate in an agile software development life cycle, including providing testing guidelines for database-related changes.
- Good understanding of a programming language, preferably Python, for custom scripts.
- Understanding of NoSQL and in-memory databases is a value add.
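As a small illustration of the query-optimization and index-management skills listed above, the sketch below uses Python's built-in sqlite3 module as a stand-in for PostgreSQL (where you would use EXPLAIN / EXPLAIN ANALYZE instead); the table and index names are hypothetical. It shows how the query plan changes once a suitable index exists:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(i, i % 100, float(i)) for i in range(1000)])

query = "SELECT total FROM orders WHERE customer_id = 42"

def plan(sql):
    # EXPLAIN QUERY PLAN is sqlite's rough analogue of Postgres's EXPLAIN;
    # the last column of each row is the human-readable plan step.
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

before = plan(query)  # without an index: a full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id, total)")
after = plan(query)   # with a covering index: an index lookup, no table access
print(before)
print(after)
```

The same habit of checking plans before and after an index change carries over directly to PostgreSQL.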
Posted 2 weeks ago
2.0 - 4.0 years
2 - 6 Lacs
Gurugram
Work from Office
About the Opportunity
Job Type: Application
Date: 29 July 2025
Title: Analyst Programmer
Department: WPFH
Location: Gurgaon
Level: 2

Intro
We're proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together - and supporting each other - all over the world. So, join our [insert name of team/ business area] team and feel like you're part of something bigger.

About your team
The successful candidate would join the Data team. The candidate would be responsible for building data integration and distribution experience to work within the Distribution Data and Reporting team and its consumers. The team is responsible for developing new, and supporting existing, middle-tier integration services and business services, and is committed to driving forward the development of leading-edge solutions.

About your role
This role would be responsible for liaising with the technical leads, business analysts, and various product teams to design, develop and troubleshoot the ETL jobs for various operational data stores. The role will involve understanding the technical design, development and implementation of ETL and EAI architecture using Informatica / ETL tools. The successful candidate will be able to demonstrate an innovative and enthusiastic approach to technology and problem solving, will display good interpersonal skills, and will show confidence and the ability to interact professionally with people at all levels, exhibiting a high level of ownership within a demanding working environment.

Key Responsibilities
- Work with technical leads, business analysts and other subject matter experts.
- Understand the data model/design and develop the ETL jobs.
- Sound technical knowledge of Informatica, to take ownership of allocated development activities and work independently.
- Working knowledge of Oracle databases, to take ownership of the underlying SQL for the ETL jobs (under guidance of the technical leads).
- Provide development estimates.
- Implement standards, procedures and best practices for data maintenance, reconciliation and exception management.
- Interact with cross-functional teams to coordinate dependencies and deliverables.

Essential Skills
Technical:
- Deep knowledge and experience of the Informatica PowerCenter tool set (minimum 3 years).
- Experience in Snowflake.
- Experience with source control tools.
- Experience using job scheduling tools such as Control-M.
- Experience in UNIX scripting.
- Strong SQL or PL/SQL experience, with a minimum of 2 years' experience.
- Experience with data warehouse, data mart and ODS concepts.
- Knowledge of data normalisation/OLAP and Oracle performance optimisation techniques.
- 3+ years' experience of either Oracle or SQL Server and its utilities, coupled with experience of UNIX/Windows.
Functional:
- 3+ years' experience working within financial organisations, with broad-based business process, application and technology architecture experience.
- Experience with data distribution and access concepts, with the ability to use these concepts to realise a proper physical model from a conceptual one.
- Business-facing, with the ability to work alongside data stewards in systems and the business.
- Strong interpersonal, communication and client-facing skills.
- Ability to work closely with cross-functional teams.

About you
- B.E./B.Tech/MBA/M.C.A/any other bachelor's degree.
- At least 3+ years of experience in data integration and distribution.
- Experience in building web services and APIs.
- Knowledge of Agile software development life-cycle methodologies.
Posted 2 weeks ago
5.0 - 10.0 years
7 - 11 Lacs
Pune
Work from Office
Sr Data Engineer 2

We are seeking a Sr. Data Engineer to join our Data Engineering team within our Enterprise Data Insights organization to build data solutions, design and implement ETL/ELT processes, and manage our data platform to enable our cross-functional stakeholders. As a part of our Corporate Engineering division, our vision is to spearhead technology and data-led solutions and experiences to drive growth and innovation at scale. The ideal candidate will have a strong data engineering background, advanced Python knowledge, and experience with cloud services and SQL/NoSQL databases. You will work closely with our cross-functional stakeholders in Product, Finance and GTM, along with Business and Enterprise Technology teams.

As a Senior Data Engineer, you will:
- Collaborate closely with various stakeholders to prioritize requests, identify improvements, and offer recommendations.
- Take the lead in analyzing, designing, and implementing data solutions, which involves constructing and designing data models and ETL processes.
- Cultivate collaboration with corporate engineering, product teams, and other engineering groups.
- Lead and mentor engineering discussions, advocating for best practices.
- Actively participate in design and code reviews.
- Access and explore third-party data APIs to determine the data required to meet business needs.
- Ensure data quality and integrity across different sources and systems.
- Manage data pipelines for both analytics and operational purposes.
- Continuously enhance processes and policies to improve SLA and SOX compliance.

You'll be a great addition to the team if you:
- Hold a B.S., M.S., or Ph.D. in Computer Science or a related technical field.
- Possess over 5 years of experience in data engineering, focusing on building and maintaining data environments.
- Demonstrate at least 5 years of experience in designing and constructing ETL/ELT processes and managing data solutions within an SLA-driven environment.
- Exhibit a strong background in developing data products and APIs and maintaining testing, monitoring, isolation, and SLA processes.
- Possess advanced knowledge of SQL/NoSQL databases (such as Snowflake, Redshift, MongoDB).
- Are proficient in programming with Python or other scripting languages.
- Have familiarity with columnar OLAP databases and data modeling.
- Have experience building ELT/ETL processes using tools like dbt, Airflow and Fivetran, CI/CD using GitHub, and reporting in Tableau.
- Possess excellent communication and interpersonal skills to effectively collaborate with various business stakeholders and translate requirements.

Added bonus if you also have:
- A good understanding of Salesforce and NetSuite systems.
- Experience in SaaS environments.
- Designed and deployed ML models.
- Experience with events and streaming data.
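Orchestrators like Airflow and dbt, mentioned above, both boil down to resolving a directed acyclic graph of task dependencies before running anything. A minimal sketch of that idea using only Python's standard library (the task names are hypothetical, and this is not the Airflow API itself):

```python
from graphlib import TopologicalSorter

# Toy dependency graph for an ELT pipeline: each task maps to the set of
# tasks that must finish before it can start.
deps = {
    "extract_orders": set(),
    "extract_customers": set(),
    "stage_tables": {"extract_orders", "extract_customers"},
    "build_marts": {"stage_tables"},
    "refresh_dashboards": {"build_marts"},
}

# static_order() yields one valid execution order; a scheduler would also
# run the two independent extracts in parallel.
order = list(TopologicalSorter(deps).static_order())
print(order)
```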
Posted 2 weeks ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Company Overview Docusign brings agreements to life. Over 1.5 million customers and more than a billion people in over 180 countries use Docusign solutions to accelerate the process of doing business and simplify people’s lives. With intelligent agreement management, Docusign unleashes business-critical data that is trapped inside of documents. Until now, these were disconnected from business systems of record, costing businesses time, money, and opportunity. Using Docusign’s Intelligent Agreement Management platform, companies can create, commit, and manage agreements with solutions created by the #1 company in e-signature and contract lifecycle management (CLM). What you'll do Docusign is seeking a talented and results oriented Data Engineer to focus on delivering trusted data to the business. As a member of the Global Data Analytics (GDA) Team, the Data Engineer leverages a variety of technologies to design, develop and deliver new features in addition to loading, transforming and preparing data sets of all shapes and sizes for teams around the world. During a typical day, the Engineer will spend time developing new features to analyze data, develop solutions and load tested data sets into the Snowflake Enterprise Data Warehouse. The ideal candidate will demonstrate a positive “can do” attitude, a passion for learning and growing, and the drive to work hard and get the job done in a timely fashion. This individual contributor position provides plenty of room to grow -- a mix of challenging assignments, a chance to work with a world-class team, and the opportunity to use innovative technologies such as AWS, Snowflake, dbt, Airflow and Matillion. This is an individual contributor role reporting to the Manager, Data Engineering. 
Responsibilities
- Design, develop and maintain scalable and efficient data pipelines.
- Analyze and develop data quality and validation procedures.
- Work with stakeholders to understand data requirements and provide solutions.
- Troubleshoot and resolve data issues in a timely manner.
- Learn and leverage available AI tools for increased developer productivity.
- Collaborate with cross-functional teams to ingest data from various sources.
- Continuously evaluate and improve data architecture and processes.
- Own, monitor, and improve solutions to ensure SLAs are met.
- Develop and maintain documentation for data infrastructure and processes.
- Execute projects using Agile Scrum methodologies and be a team player.

Job Designation
Hybrid: Employee divides their time between in-office and remote work. Access to an office location is required. (Frequency: minimum 2 days per week; may vary by team, but there will be a weekly in-office expectation.) Positions at Docusign are assigned a job designation of either In Office, Hybrid or Remote, specific to the role/job. Preferred job designations are not guaranteed when changing positions within Docusign. Docusign reserves the right to change a position's job designation depending on business needs and as permitted by local law.
What you bring
Basic:
- Bachelor's degree in Computer Science, Data Analytics, Information Systems, etc.
- Experience developing data pipelines in Python or Java.
- 5+ years of dimensional and relational data modeling experience.
Preferred:
- 5+ years in data warehouse engineering (OLAP): Snowflake, Teradata, etc.
- 5+ years with transactional databases (OLTP): Oracle, SQL Server, MySQL.
- 5+ years with commercial ETL tools: dbt, Matillion, etc.
- 5+ years delivering ETL solutions from source systems, databases, APIs, flat files, and JSON.
- Experience developing entity-relationship diagrams with Erwin, SqlDBM, or equivalent.
- Experience working with job scheduling and monitoring systems (Airflow, Datadog, AWS SNS).
- Familiarity with gen-AI tools like GitHub Copilot and dbt Copilot; a good understanding of gen-AI application frameworks; knowledge of any agentic platforms.
- Experience building BI dashboards with tools like Tableau.
- Experience in the financial domain: sales and marketing, accounts payable, accounts receivable, invoicing.
- Experience managing work assignments using tools like Jira and Confluence.
- Experience with Scrum/Agile methodologies.
- Ability to work independently and as part of a team.
- Excellent analytical, problem-solving and communication skills.
- Excellent SQL and database management skills.

Life at Docusign
Working here
Docusign is committed to building trust and making the world more agreeable for our employees, customers and the communities in which we live and work. You can count on us to listen, be honest, and try our best to do what's right, every day. At Docusign, everything is equal. We each have a responsibility to ensure every team member has an equal opportunity to succeed, to be heard, to exchange ideas openly, to build lasting relationships, and to do the work of their life. Best of all, you will be able to feel deep pride in the work you do, because your contribution helps us make the world better than we found it.
And for that, you’ll be loved by us, our customers, and the world in which we live. Accommodation Docusign is committed to providing reasonable accommodations for qualified individuals with disabilities in our job application procedures. If you need such an accommodation, or a religious accommodation, during the application process, please contact us at accommodations@docusign.com. If you experience any issues, concerns, or technical difficulties during the application process please get in touch with our Talent organization at taops@docusign.com for assistance. Applicant and Candidate Privacy Notice
Posted 2 weeks ago
0 years
6 - 9 Lacs
Calcutta
On-site
Job requisition ID: 80285
Date: Jul 17, 2025
Location: Kolkata
Designation: Consultant
Entity: Deloitte Touche Tohmatsu India LLP

Your potential, unleashed.
India's impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realise your potential amongst cutting-edge leaders and organisations shaping the future of the region, and indeed, the world beyond. At Deloitte, you bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

The team
As a member of the Operations, Industry and Domain Solutions team, you will embark on an exciting and fulfilling journey with a group of intelligent, innovative and globally aware individuals. We work in conjunction with various institutions solving key business problems across a broad spectrum of roles and functions, all set against the backdrop of constant industry change.

Your work profile
The business analyst shall be a graduate in any field with a postgraduate qualification in business administration or equivalent, with experience in crafting and executing queries upon request for data, presenting information through reports and visualization, an in-depth understanding of database management systems, online analytical processing (OLAP) and the ETL (extract, transform, load) framework, and experience in study methods for system and functional requirement specification.

How you'll grow
Connect for impact
Our exceptional team of professionals across the globe are solving some of the world's most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.

Empower to lead
You can be a leader irrespective of your career level.
Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership. Inclusion for all At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude and potential each and every one of us brings to the table to make an impact that matters. Drive your career At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one size fits all career path, and global, cross-business mobility and up / re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte. Everyone’s welcome… entrust your happiness to us Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here’s a glimpse of things that are in store for you. Interview tips We want job seekers exploring opportunities at Deloitte to feel prepared, confident and comfortable. To help you with your interview, we suggest that you do your research, know some background about the organisation and the business area you’re applying to. Check out recruiting tips from Deloitte professionals.
Posted 2 weeks ago
10.0 years
0 Lacs
Andhra Pradesh
On-site
Position Overview This position’s primary responsibility will be to translate software requirements into functions using Mainframe, ETL frameworks, and database technologies. The chosen candidate will apply technical proficiency across different stages of the Software Development Life Cycle, gather accurate requirements, and work closely with stakeholders to prioritize tasks. The role will require strong attention to detail with the ability to identify errors and make adjustments in a testing environment while contributing towards developing and adhering to best practices for developing applications that are scalable, relevant, and critical to the project. Responsibilities Support, maintain and participate in the development of software utilizing technologies such as COBOL, DB2, CICS and JCL. Support, maintain and participate in the ETL development of software utilizing technologies such as Talend, Ab Initio, AWS Glue, PySpark. Provide expertise, tools, and assistance to operations, development, and support teams for critical production issues and maintenance Troubleshoot production issues, diagnose the problem, and implement a solution - First line of defense in finding the root cause Work cross-functionally with the support team, development team and business team to efficiently address customer issues. Active member of high-performance software development and support team in an agile environment. Qualifications Required Skills: Strong Mainframe experience, using COBOL, JCL, DB2 and VSAM Strong SQL knowledge on OLAP DB platforms (Teradata, Snowflake) and OLTP DB platforms (Oracle, DB2, Postgres, SingleStore). 
Strong experience with Teradata SQL and Utilities Strong experience with Oracle, Postgres and DB2 SQL and Utilities Develop high-quality database solutions Ability to perform extensive analysis of complex SQL processes; strong design skills Ability to analyze existing SQL queries for performance improvements Experience in ETL frameworks like Talend, Ab Initio, AWS Glue. Experience in software development phases including design, configuration, testing, debugging, implementation, and support of large-scale, business-centric and process-based applications Proven experience working with diverse teams of technical architects, business users and IT areas on all phases of the software development life cycle. Exceptional analytical and problem-solving skills Structured, methodical approach to systems development and troubleshooting Ability to ramp up fast on a system architecture Experience designing and developing process-based solutions or BPM (business process management) Strong written and verbal communication skills with the ability to interact with all levels of the organization. Strong interpersonal/relationship management skills. Strong time and project management skills. Familiarity with agile methodology including SCRUM team leadership. Familiarity with modern delivery practices such as continuous integration, behavior/test driven development, and specification by example. Desire to work in application support space Passion for learning and desire to explore all areas of IT Required Experience & Education: 10+ years in an application development role Bachelor's Degree or higher from an accredited university or a minimum of three (3) years of experience in software development in lieu of the bachelor’s degree education requirement Desired Experience: Exposure to AWS Healthcare experience About Evernorth Health Services Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. 
We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
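The listing above asks for the ability to analyze existing SQL queries for performance improvements. As an illustrative, self-contained sketch (SQLite stands in for the Teradata/Oracle/DB2 platforms the role actually names, and the table, column, and index names are invented), this shows how adding an index changes a query plan from a full scan to an index search:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (id INTEGER PRIMARY KEY, member_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO claims (member_id, amount) VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(1000)],
)

query = "SELECT SUM(amount) FROM claims WHERE member_id = ?"

# Without an index on member_id, the plan is a full table scan.
before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

conn.execute("CREATE INDEX idx_claims_member ON claims (member_id)")

# With the index, SQLite reports an index search instead of a scan.
after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

# The last column of each plan row is the human-readable detail string.
plan_before = " ".join(row[-1] for row in before)
plan_after = " ".join(row[-1] for row in after)
print(plan_before)
print(plan_after)
```

On recent SQLite builds the first plan reads along the lines of `SCAN claims` and the second `SEARCH claims USING INDEX idx_claims_member (member_id=?)`; the same style of plan analysis applies, with different syntax, on the platforms the role targets.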
Posted 2 weeks ago
0.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka
On-site
3 months ago TESCRA India DESCRIPTION What you'll Do System integration of heterogeneous data sources and working on technologies used in the design, development, testing, deployment, and operations of DW & BI solutions Create and maintain documentation, architecture designs and data flow diagrams Help to deliver scalable solutions on the MSBI platforms and Hadoop Implement source code versioning, standard methodology tools and processes for ensuring data quality Collaborate with business professionals, application developers and technical staff working in an agile process environment Assist in activities such as Source System Analysis, creating and documenting high level business model design, UAT, project management etc. Skills What you need to succeed 3+ years of relevant work experience in SSIS, SSAS, DW, Data Analysis and Business Intelligence. Must have expert knowledge of Data warehousing tools – SSIS, SSAS, DB Must have expert knowledge of T-SQL, stored procedures, database performance tuning. Strong in Data Warehousing, Business Intelligence and Dimensional Modelling concepts with experience in Designing, developing & maintaining ETL, database & OLAP Schema and Public Objects (Attributes, Facts, Metrics etc.) Good to have experience in developing Reports and Dashboards using BI reporting tools like Tableau, SSRS, Power BI etc. Fast learner, analytical, with the ability to understand multiple businesses and their performance indicators Bachelor's degree in Computer Science or equivalent Superb communication and presentation skills QUALIFICATIONS Must Have Skills BI DWH DATA WAREHOUSING SSIS SQL Minimum Education Level Bachelors or Equivalent Years of Experience 3-10 years ADDITIONAL INFORMATION Work Type: FullTime Location: Bangalore, Karnataka, India Job ID: Tescra-Ado-889DB6
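The dimensional-modelling concepts this listing names (OLAP schemas, attributes, facts, metrics) can be made concrete with a minimal star schema: one fact table with foreign keys into dimension tables, queried to produce a metric. This sketch uses SQLite purely for illustration, in place of the MSBI stack, and all table names are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    revenue REAL
);
INSERT INTO dim_date VALUES (1, 2024), (2, 2025);
INSERT INTO dim_product VALUES (10, 'books'), (20, 'games');
INSERT INTO fact_sales VALUES (1, 10, 100.0), (1, 20, 50.0), (2, 10, 75.0);
""")

# A typical OLAP-style rollup: revenue (metric) by year and category (attributes).
rows = conn.execute("""
    SELECT d.year, p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.year, p.category
    ORDER BY d.year, p.category
""").fetchall()
print(rows)  # [(2024, 'books', 100.0), (2024, 'games', 50.0), (2025, 'books', 75.0)]
```

In an actual SSAS deployment the same fact/dimension layout would back a cube or tabular model rather than ad-hoc SQL.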
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
pune, maharashtra
On-site
As a BI Developer at NiCE, you will play a crucial role in developing Reports for a multi-region, multi-tenant SaaS product. Collaborating with the core R&D team, you will be responsible for creating high-performance Reports that cater to the use cases of various applications within the suite. Your impact will be significant as you take ownership of the software development lifecycle, encompassing design, development, unit testing, and deployment. You will work closely with QA teams to ensure the consistent implementation of architectural concepts across the product. Additionally, you will act as a product expert within R&D, understanding the product's requirements and market positioning. Collaboration with cross-functional teams such as Product Managers, Sales, Customer Support, and Services will be essential to ensure successful product delivery. Key responsibilities include designing and building Reports based on given requirements, creating design documents and test cases, developing SQL to address ad-hoc report requirements, conducting analyses, and creating visualizations and reports as per specifications. You will also be involved in executing unit testing, functional and performance testing, documenting results, conducting peer reviews, and ensuring quality standards are met throughout all stages. To excel in this role, you should hold a Bachelor's or Master's degree in Computer Science, Electronic Engineering, or equivalent from a reputable institute. With 2-4 years of BI report development experience, you should possess expertise in SQL and cloud-based databases, along with proficiency in BI tools such as Tableau, Power BI, or MicroStrategy. Experience in enterprise Data warehouse/Data Lake systems, analytical databases, and ETL frameworks is essential. Familiarity with Snowflake, database management systems, OLAP, and working in Agile environments is highly desirable. 
Joining NiCE offers you the opportunity to be part of an ever-growing, market-disrupting global company where the best talents collaborate in a fast-paced, innovative environment. As a NiCE team member, you will have access to endless internal career opportunities across various roles, disciplines, domains, and locations. If you are passionate, innovative, and eager to push boundaries, NiCE might just be the perfect fit for you. At NiCE, we operate according to the NiCE-FLEX hybrid model, allowing maximum flexibility with 2 days working from the office and 3 days of remote work each week. Office days focus on face-to-face meetings that foster teamwork, collaboration, innovation, and a vibrant atmosphere. Reporting directly to the Tech Manager, this role is classified as an Individual Contributor position at NiCE. NiCE Ltd. (NASDAQ: NICE) is a renowned provider of software products used by over 25,000 global businesses, including 85 Fortune 100 corporations. With a focus on delivering exceptional customer experiences, combatting financial crime, and ensuring public safety, NiCE software manages more than 120 million customer interactions and monitors over 3 billion financial transactions daily. Recognized for innovation in AI, cloud, and digital solutions, NiCE employs over 8,500 professionals across 30+ countries.,
Posted 2 weeks ago
8.0 - 13.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Join Amgen’s Mission of Serving Patients At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas –Oncology, Inflammation, General Medicine, and Rare Disease– we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lay within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. Manager Data Engineering What You Will Do Let’s do this. Let’s change the world. In this vital role you will lead and scale an impactful team of data engineers. This role blends technical depth with strategic oversight and people leadership. The ideal candidate will oversee the execution of data engineering initiatives, collaborate with business analysts and multi-functional teams, manage resource capacity, and ensure delivery aligned to business priorities. In addition to technical competence, the candidate will be adept at managing agile operations and driving continuous improvement. Roles & Responsibilities: Possesses strong rapid prototyping skills and can quickly translate concepts into working code. Provide expert guidance and mentorship to the data engineering team, fostering a culture of innovation and standard methodologies. Design, develop, and implement robust data architectures and platforms to support business objectives. Oversee the development and optimization of data pipelines, and data integration solutions. 
Establish and maintain data governance policies and standards to ensure data quality, security, and compliance. Architect and manage cloud-based data solutions, using AWS or other preferred platforms. Lead and motivate an impactful data engineering team to deliver exceptional results. Identify, analyze, and resolve complex data-related challenges. Collaborate closely with business collaborators to understand data requirements and translate them into technical solutions. Stay abreast of emerging data technologies and explore opportunities for innovation. Lead and manage a team of data engineers, ensuring appropriate workload distribution, goal alignment, and performance management. Work closely with business analysts and product collaborators to prioritize and align engineering output with business objectives. What We Expect Of You We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications: Doctorate, Master's, or Bachelor's degree and 8 to 13 years of experience; Computer Science and Engineering preferred, other engineering fields considered Demonstrated proficiency in using cloud platforms (AWS, Azure, GCP) for data engineering solutions. Strong understanding of cloud architecture principles and cost optimization strategies. Proficient in Python, PySpark, and SQL. Hands-on experience with big data ETL performance tuning. Proven ability to lead and develop impactful data engineering teams. Strong problem-solving, analytical, and critical thinking skills to address complex data challenges. Strong communication skills for collaborating with business and technical teams alike. 
Preferred Qualifications: Experienced with data modeling and performance tuning for both OLAP and OLTP databases Experienced with Apache Spark, Apache Airflow Experienced with software engineering best-practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven etc.), automated unit testing, and Dev Ops Experienced with AWS, GCP or Azure cloud services Professional Certification: AWS Certified Data Engineer preferred Databricks Certificate preferred Soft Skills : Excellent analytical and troubleshooting skills. Strong verbal and written communication skills Ability to work effectively with global, virtual teams High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals Strong presentation and public speaking skills. What You Can Expect Of Us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. 
We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
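The ETL work the role above describes (pipelines, data quality, performance tuning) reduces at its core to an extract-transform-load loop. The following toy sketch in plain Python uses an invented record layout and stands in for tools like Talend, AWS Glue, or PySpark; it is an illustration of the pattern, not any Amgen pipeline:

```python
def extract():
    # In a real pipeline this would read from a database, S3, an API, etc.
    return [
        {"id": 1, "amount": "10.5", "region": " east "},
        {"id": 2, "amount": "n/a", "region": "west"},
        {"id": 3, "amount": "7.25", "region": "EAST"},
    ]

def transform(rows):
    # Clean types, normalize text, and drop rows that fail validation.
    out = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # skip (or quarantine) malformed records
        out.append({"id": row["id"], "amount": amount,
                    "region": row["region"].strip().lower()})
    return out

def load(rows, target):
    # A list stands in for the warehouse table being appended to.
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded, warehouse[0]["region"])  # 2 east
```

Orchestrators such as Apache Airflow, also named in the listing, schedule and retry exactly these stages as tasks in a DAG.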
Posted 2 weeks ago
5.0 years
0 Lacs
India
On-site
About The Role & Team We are seeking a Sr. Data Engineer to join our Data Engineering team within our Enterprise Data Insights organization to build data solutions, design and implement ETL/ELT processes and manage our data platform to enable our cross-functional stakeholders. As a part of our Corporate Engineering division, our vision is to spearhead technology and data-led solutions and experiences to drive growth & innovation at scale. The ideal candidate will have a strong Data Engineering background, advanced Python knowledge and experience with cloud services and SQL/NoSQL databases. You will work closely with our cross-functional stakeholders in Product, Finance and GTM along with Business and Enterprise Technology teams. As a Senior Data Engineer, you will: • Collaborate closely with various stakeholders to prioritize requests, identify improvements, and offer recommendations. • Take the lead in analyzing, designing, and implementing data solutions, which involves constructing and designing data models and ETL processes. • Cultivate collaboration with corporate engineering, product teams, and other engineering groups. • Lead and mentor engineering discussions, advocating for best practices. • Actively participate in design and code reviews. • Access and explore third-party data APIs to determine the data required to meet business needs. • Ensure data quality and integrity across different sources and systems. • Manage data pipelines for both analytics and operational purposes. • Continuously enhance processes and policies to improve SLA and SOX compliance. You'll be a great addition to the team if you: • Hold a B.S., M.S., or Ph.D. in Computer Science or a related technical field. • Possess over 5 years of experience in Data Engineering, focusing on building and maintaining data environments. 
• Demonstrate at least 5 years of experience in designing and constructing ETL/ELT processes, managing data solutions within an SLA-driven environment. • Exhibit a strong background in developing data products, APIs, and maintaining testing, monitoring, isolation, and SLA processes. • Possess advanced knowledge of SQL/NoSQL databases (such as Snowflake, Redshift, MongoDB). • Proficient in programming with Python or other scripting languages. • Have familiarity with columnar OLAP databases and data modeling. • Experience in building ELT/ETL processes using tools like dbt, Airflow, Fivetran, CI/CD using GitHub, and reporting in Tableau. • Possess excellent communication and interpersonal skills to effectively collaborate with various business stakeholders and translate requirements. Added bonus if you also have: • A good understanding of Salesforce or Netsuite systems • Experience in SAAS environments • Designed and deployed ML models • Experience with events and streaming data
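One of the responsibilities above, ensuring data quality and integrity across sources, is commonly implemented as rule functions run over each batch before it is loaded. This is a minimal sketch with invented column names and rules, not any specific team's framework:

```python
def check_not_null(rows, column):
    """Return the rows where the given column is missing or null."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Return the rows whose value in the given column repeats an earlier one."""
    seen, dupes = set(), []
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.append(r)
        seen.add(v)
    return dupes

batch = [
    {"order_id": 1, "email": "a@x.com"},
    {"order_id": 2, "email": None},
    {"order_id": 2, "email": "c@x.com"},
]

violations = {
    "email_not_null": check_not_null(batch, "email"),
    "order_id_unique": check_unique(batch, "order_id"),
}
failed = {name: len(v) for name, v in violations.items() if v}
print(failed)  # {'email_not_null': 1, 'order_id_unique': 1}
```

In production the failed-rule counts would feed monitoring and SLA alerting rather than a print statement.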
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
You are a skilled Data Modeler with expertise in using DBSchema within GCP environments. In this role, you will be responsible for creating and optimizing data models for both OLTP and OLAP systems, ensuring they are well-designed for performance and maintainability. Your key responsibilities will include developing conceptual, logical, and physical models using DBSchema, aligning schema design with application requirements, and optimizing models in BigQuery, CloudSQL, and AlloyDB. Additionally, you will be involved in supporting schema documentation, reverse engineering, and visualization tasks. Your must-have skills for this role include proficiency in using the DBSchema modeling tool, strong experience with GCP databases such as BigQuery, CloudSQL, and AlloyDB, as well as knowledge of OLTP and OLAP system structures and performance tuning. It is essential to have expertise in SQL and schema evolution/versioning best practices. Preferred skills include experience integrating DBSchema with CI/CD pipelines and knowledge of real-time ingestion pipelines and federated schema design. As a Data Modeler, you should possess soft skills such as being detail-oriented, organized, and communicative. You should also feel comfortable presenting schema designs to cross-functional teams. By joining this role, you will have the opportunity to work with industry-leading tools in modern GCP environments, enhance modeling workflows, and contribute to enterprise data architecture with visibility and impact.,
Posted 2 weeks ago
3.0 years
0 Lacs
Andhra Pradesh, India
On-site
At PwC, our people in infrastructure focus on designing and implementing robust, secure IT systems that support business operations. They enable the smooth functioning of networks, servers, and data centres to optimise performance and minimise downtime. In infrastructure engineering at PwC, you will focus on designing and implementing robust and scalable technology infrastructure solutions for clients. Your work will involve network architecture, server management, and cloud computing experience. Data Modeler Job Description: Looking for candidates with a strong background in data modeling, metadata management, and data system optimization. You will be responsible for analyzing business needs, developing long-term data models, and ensuring the efficiency and consistency of our data systems. Key areas of expertise include: Analyze and translate business needs into long-term solution data models. Evaluate existing data systems and recommend improvements. Define rules to translate and transform data across data models. Work with the development team to create conceptual data models and data flows. Develop best practices for data coding to ensure consistency within the system. Review modifications of existing systems for cross-compatibility. Implement data strategies and develop physical data models. Update and optimize local and metadata models. Utilize canonical data modeling techniques to enhance data system efficiency. Evaluate implemented data systems for variances, discrepancies, and efficiency. Troubleshoot and optimize data systems to ensure optimal performance. Strong expertise in relational and dimensional modeling (OLTP, OLAP). Experience with data modeling tools (Erwin, ER/Studio, Visio, PowerDesigner). Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL). Knowledge of NoSQL databases (MongoDB, Cassandra) and their data structures. 
Experience working with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI). Familiarity with ETL processes, data integration, and data governance frameworks. Strong analytical, problem-solving, and communication skills. Qualifications: Bachelor's degree in Engineering or a related field. 3 to 5 years of experience in data modeling or a related field. 4+ years of hands-on experience with dimensional and relational data modeling. Expert knowledge of metadata management and related tools. Proficiency with data modeling tools such as Erwin, Power Designer, or Lucid. Knowledge of transactional databases and data warehouses. Preferred Skills: Experience in cloud-based data solutions (AWS, Azure, GCP). Knowledge of big data technologies (Hadoop, Spark, Kafka). Understanding of graph databases and real-time data processing. Certifications in data management, modeling, or cloud data engineering. Excellent communication and presentation skills. Strong interpersonal skills to collaborate effectively with various teams.
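The dimensional-modeling skill this listing emphasizes can be illustrated by the core mechanical step: splitting flat operational (OLTP-style) records into a dimension table with surrogate keys and a fact table that references it. Field names and data here are invented for the sketch:

```python
flat = [
    {"customer": "acme", "city": "pune", "amount": 120.0},
    {"customer": "acme", "city": "pune", "amount": 80.0},
    {"customer": "zen", "city": "delhi", "amount": 50.0},
]

dim_customer = {}  # natural key -> surrogate key
dim_rows = []
fact_rows = []
for rec in flat:
    nk = (rec["customer"], rec["city"])  # natural key of the dimension
    if nk not in dim_customer:
        # Assign the next surrogate key and emit one dimension row per distinct key.
        dim_customer[nk] = len(dim_customer) + 1
        dim_rows.append({"customer_key": dim_customer[nk],
                         "customer": rec["customer"], "city": rec["city"]})
    # The fact row carries only the surrogate key plus the measure.
    fact_rows.append({"customer_key": dim_customer[nk], "amount": rec["amount"]})

print(len(dim_rows), len(fact_rows))  # 2 3
```

Tools like Erwin or ER/Studio, named above, capture this same structure declaratively; the loop just shows what the conformed dimension load does.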
Posted 2 weeks ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Summary We are seeking a skilled and experienced SSIS (SQL Server Integration Services) and SSAS (SQL Server Analysis Services) Developer to join our data engineering and business intelligence team. The ideal candidate will be responsible for developing, maintaining, and optimizing ETL packages and OLAP/Tabular models to support data warehousing and reporting. Responsibilities: Design, develop, deploy, and maintain SSIS packages for data extraction, transformation, and loading (ETL). Develop and manage SSAS cubes (Multidimensional and/or Tabular Models) to support advanced analytics and reporting. Collaborate with business analysts, data architects, and stakeholders to understand data requirements. Optimize existing ETL processes for performance, scalability, and reliability. Create and maintain technical documentation, including data flow diagrams and data dictionaries. Monitor ETL workflows, troubleshoot issues, and implement data quality checks. Perform data validation and unit testing to ensure accuracy of ETL and cube outputs. Integrate SSIS/SSAS with Power BI, Excel, and other reporting tools. Participate in code reviews, sprint planning, and agile development. Qualifications: Bachelor's degree in Computer Science, Information Systems, or a related field. 3+ years of hands-on experience with SSIS and SSAS. Strong experience with SQL Server and T-SQL. Experience building both Multidimensional and Tabular SSAS models. Deep understanding of data warehousing concepts, star/snowflake schema, and ETL best practices. Familiarity with performance tuning in SSIS and SSAS. Proficient with data visualization tools like Power BI or Excel (PivotTables). Knowledge of version control systems such as Git. Preferred Skills: Experience with Azure Data Factory, Synapse, or other cloud-based data services. Exposure to DevOps CI/CD pipelines for SSIS/SSAS deployments. Familiarity with MDX and DAX query languages. 
Certification in Microsoft SQL Server BI Stack is a plus. Soft Skills: Strong analytical and problem-solving skills. Effective communication and collaboration skills. Ability to work independently and manage multiple tasks. (ref:hirist.tech)
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Data Architect specializing in OLTP & OLAP Systems, you will play a crucial role in designing, optimizing, and governing data models for both OLTP and OLAP environments. Your responsibilities will include architecting end-to-end data models across different layers, defining conceptual, logical, and physical data models, and collaborating closely with stakeholders to capture functional and performance requirements. You will need to optimize database structures for real-time and analytical workloads, enforce data governance, security, and compliance best practices, and enable schema versioning, lineage tracking, and change control. Additionally, you will review query plans and indexing strategies to enhance performance. To excel in this role, you must possess a deep understanding of OLTP and OLAP systems architecture, along with proven experience in GCP databases such as BigQuery, CloudSQL, and AlloyDB. Your expertise in database tuning, indexing, sharding, and normalization/denormalization will be critical, as well as proficiency in data modeling tools like DBSchema, ERWin, or equivalent. Familiarity with schema evolution, partitioning, and metadata management is also required. Experience in the BFSI or mutual fund domain, knowledge of near real-time reporting and streaming analytics architectures, and familiarity with CI/CD for database model deployments are preferred skills that will set you apart. Strong communication, stakeholder management, strategic thinking, and the ability to mentor data modelers and engineers are essential soft skills for success in this position. By joining our team, you will have the opportunity to own the core data architecture for a cloud-first enterprise, bridge business goals with robust data design, and work with modern data platforms and tools. If you are looking to make a significant impact in the field of data architecture, this role is perfect for you.,
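Among the tuning concerns this role lists, sharding can be sketched as deterministic hash routing of rows to shards. The key format (echoing the BFSI/mutual-fund domain mentioned above) and the shard count are arbitrary assumptions for illustration:

```python
import hashlib

NUM_SHARDS = 4  # arbitrary for the sketch; real systems size this to the cluster

def shard_for(key: str) -> int:
    """Route a record key to a shard via a stable hash, so placement is deterministic."""
    digest = hashlib.sha256(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_SHARDS

assignments = {k: shard_for(k) for k in ["fund-001", "fund-002", "fund-003"]}
# The same key always lands on the same shard, across processes and restarts.
assert shard_for("fund-001") == assignments["fund-001"]
print(assignments)
```

Simple modulo hashing reshuffles most keys when NUM_SHARDS changes; consistent hashing is the usual refinement when shards are added or removed frequently.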
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
chennai, tamil nadu
On-site
The Applications Development Senior Programmer Analyst position is an intermediate level role where you will be responsible for participating in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your main objective will be to contribute to applications systems analysis and programming activities. Your responsibilities will include conducting tasks related to feasibility studies, time and cost estimates, IT planning, risk technology, applications development, model development, and establishing and implementing new or revised applications systems and programs to meet specific business needs or user areas. You will also be monitoring and controlling all phases of the development process, providing user and operational support on applications to business users, and utilizing in-depth specialty knowledge of applications development to analyze complex problems/issues. You will be recommending and developing security measures in post-implementation analysis, consulting with users/clients and other technology groups, recommending advanced programming solutions, and ensuring essential procedures are followed while defining operating standards and processes. Additionally, you will serve as an advisor or coach to new or lower-level analysts, operate with a limited level of direct supervision, and exercise independence of judgment and autonomy. As an Applications Development Senior Programmer Analyst, you will act as a Subject Matter Expert (SME) to senior stakeholders and/or other team members, assess risks when making business decisions, and drive compliance with applicable laws, rules, and regulations. You will also be responsible for developing Marketdata solutions for Risk and PnL systems and business-critical enhancements to Risk and PnL modules. 
To qualify for this role, you should have a minimum of 8 years of experience in software development, strong C# skills, strong analytical and problem-solving skills, experience in design and development, and the ability to work effectively in multi-disciplinary teams with low levels of supervision. Additionally, you should possess excellent C# and .NET skills, knowledge of TDD, OOAD, design patterns, Gemfire, Messaging Systems (EMS/Kafka), SQL Server, OLAP, SSAS, web development technologies such as ASP.NET, MVC, JavaScript, Ajax, and all-round technology skills. Knowledge of Python is optional but preferred. In terms of education, a good academic background with at least an Undergraduate degree, preferably in a Technical or Business-related subject, is required. Please note that this job description offers a high-level overview of the work performed, and other job-related duties may be assigned as required.,
Posted 2 weeks ago
10.0 - 15.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Staff Software Engineer specializing in Java at Walmart Global Tech in Chennai, you will play a crucial role in guiding the team on architectural decisions and best practices for building scalable applications. Your responsibilities will include driving the design, development, implementation, and documentation of cutting-edge solutions that impact Walmart associates globally. You will collaborate with engineering teams across different locations, engage with Product Management and Business to drive product agendas, and work closely with architects to ensure solutions meet Quality, Cost, and Delivery standards.
With a Bachelor's or Master's degree in Computer Science or a related field and a minimum of 10 years of experience in software design, development, and automated deployments, you will bring valuable expertise to the team. Prior experience delivering highly scalable Java applications, strong system design skills, and proficiency in CS fundamentals, microservices, data structures, and algorithms will be essential for success in this role. You should have hands-on experience with CI/CD development environments and tools such as Git, Maven, and Jenkins, as well as expertise in writing modular, testable code using frameworks such as JUnit and Mockito. Experience building Java-based backend systems, working with cloud-based solutions, and familiarity with technologies like Spring Boot, Kafka, and Spark will be crucial. Additionally, you should be well versed in microservices architecture, distributed systems concepts, design patterns, and cloud-native development. Experience with relational and NoSQL databases, caching technologies, event-based systems like Kafka, monitoring tools like Prometheus and Splunk, and containerization tools like Docker and Kubernetes will be highly valuable.
At Walmart Global Tech, you will have the opportunity to work in an innovative environment where your contributions can impact millions of people. The company values diversity, inclusion, and belonging, and offers a flexible, hybrid work environment along with competitive compensation, benefits, and opportunities for personal and professional growth. As an Equal Opportunity Employer, Walmart fosters a workplace culture where every individual is respected and valued, contributing to a welcoming and inclusive environment for all associates, customers, and suppliers.
Posted 2 weeks ago
10.0 years
0 Lacs
Delhi, India
On-site
Where Data Does More. Join the Snowflake team.
We are looking for a Solutions Architect to join our Professional Services team to deploy cloud products and services for our customers' Global Competency Centers located in India. This person must be a hands-on self-starter who loves solving innovative problems in a fast-paced, agile environment, with a broad range of skills and experience ranging from data architecture to ETL, security, performance analysis, and analytics. The ideal candidate will have the insight to make the connection between a customer's specific business problems and Snowflake's solution, the customer-facing skills to communicate that connection and vision to a wide variety of technical and executive audiences, and the technical skills not only to build demos and execute proofs of concept but also to provide consultative assistance on architecture and implementation.
The person we're looking for shares our passion for reinventing the data platform and thrives in a dynamic environment. That means having the flexibility and willingness to jump in and get it done to make Snowflake and our customers successful. It means keeping up to date on ever-evolving data and analytics technologies, and working collaboratively with a broad range of people inside and outside the company to be an authoritative resource for Snowflake and its customers.
AS A SOLUTIONS ARCHITECT AT SNOWFLAKE YOU WILL:
- Be a technical expert on all aspects of Snowflake
- Present Snowflake technology and vision to executives and technical contributors at customers
- Position yourself as a trusted advisor to key customer stakeholders, with a focus on achieving their desired business outcomes
- Drive project teams towards the common goal of accelerating the adoption of Snowflake solutions
- Demonstrate and communicate the value of Snowflake technology throughout the engagement, from demo to proof of concept to running workshops, design sessions, and implementation with customers and stakeholders
- Create repeatable processes and documentation as a result of customer engagements
- Collaborate on and create industry-based solutions that are relevant to other customers in order to drive more value out of Snowflake
- Deploy Snowflake following best practices, including ensuring knowledge transfer so that customers are correctly enabled and can extend the capabilities of Snowflake on their own
- Maintain a deep understanding of competitive and complementary technologies and vendors, and how to position Snowflake in relation to them
- Work with System Integrator consultants at a deep technical level to successfully position and deploy Snowflake in customer environments
- Position and sell the value of Snowflake professional services for ongoing delivery
OUR IDEAL SOLUTIONS ARCHITECT WILL HAVE:
- Minimum 10 years of experience working with customers in a pre-sales or post-sales technical role
- University degree in computer science, engineering, mathematics, or a related field, or equivalent experience
- Outstanding skills presenting to both technical and executive audiences, whether impromptu on a whiteboard or using presentations and demos
- Understanding of the complete data analytics stack and workflow, from ETL to data platform design to BI and analytics tools
- Strong skills in databases, data warehouses, and data processing
- Extensive hands-on expertise with SQL and SQL analytics
- Proficiency in implementing data security measures, access controls, and design within the Snowflake platform
- Extensive knowledge of and experience with large-scale database technology (e.g. Netezza, Exadata, Teradata, Greenplum)
- Software development experience with Python, Java, Spark, and other scripting languages
- Internal and/or external consulting experience
- Deep collaboration with Account Executives and Sales Engineers on account strategy
BONUS POINTS FOR EXPERIENCE WITH THE FOLLOWING:
- 1+ years of practical Snowflake experience
- Non-relational platforms and tools for large-scale data processing (e.g. Hadoop, HBase)
- Common BI and data exploration tools (e.g. MicroStrategy, Looker, Tableau, Power BI)
- OLAP data modeling and data architecture
- Large-scale infrastructure-as-a-service platforms (e.g. Amazon AWS, Microsoft Azure, GCP)
- AWS services such as S3, Kinesis, Elastic MapReduce, and Data Pipeline
- Delivering data migration projects
- Expertise in a core vertical such as Financial Services, Retail, Media & Entertainment, Healthcare, or Life Sciences
- Hands-on experience with Python, Java, or Scala
WHY JOIN OUR PROFESSIONAL SERVICES TEAM AT SNOWFLAKE:
- A unique opportunity to work on a truly disruptive software product
- Unique, hands-on experience with bleeding-edge data warehouse technology
- The chance to develop, lead, and execute an industry-changing initiative
- Learn from the best! Join a dedicated, experienced team of professionals.
Snowflake is growing fast, and we're scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact?
For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com
Posted 2 weeks ago
2.0 - 5.0 years
7 - 17 Lacs
Hyderabad
Work from Office
Key Responsibilities:
- Design and implement scalable data models using Snowflake to support business intelligence and analytics solutions
- Implement ETL/ELT solutions involving complex business transformations
- Handle end-to-end data warehousing solutions
- Migrate data from legacy systems to Snowflake
- Write complex SQL queries for extracting, transforming, and loading data, ensuring high performance and accuracy
- Optimize SnowSQL queries for better processing speeds
- Integrate Snowflake with third-party applications
- Use any ETL/ELT technology
- Implement data security policies, including user access control and data masking, to maintain compliance with organizational standards
- Document solutions and data flows
Skills & Qualifications:
Experience:
- 2+ years of experience in data engineering, with a focus on Snowflake
- Proficiency in SQL and Snowflake-specific SQL functions
- Experience with ETL/ELT tools and cloud data integrations
Technical Skills:
- Strong understanding of Snowflake architecture, features, and best practices
- Experience using Snowpark, Snowpipe, and Streamlit
- Experience using Dynamic Tables is good to have
- Familiarity with cloud platforms (AWS, Azure, or GCP) and other cloud-based data technologies
- Experience with data modeling concepts such as star schema, snowflake schema, and data partitioning
- Experience with Snowflake's Time Travel, Streams, and Tasks features
- Experience in data pipeline orchestration
- Knowledge of Python or Java for scripting and automation
- Knowledge of Snowflake pipelines is good to have
- Knowledge of data governance practices, including security, compliance, and data lineage
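The migration and ELT duties above center on incremental loads from staging tables into warehouse tables. A minimal sketch of that pattern, using Python's built-in sqlite3 as a stand-in for a warehouse session (Snowflake itself would use `MERGE INTO`; here sqlite's equivalent `INSERT ... ON CONFLICT` upsert is shown, and all table and column names are invented for the example):

```python
import sqlite3

# Stand-in warehouse: a target dimension table plus a staging table of changes.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT, city TEXT);
    CREATE TABLE stg_customer (id INTEGER, name TEXT, city TEXT);
    INSERT INTO dim_customer VALUES (1, 'Asha', 'Chennai'), (2, 'Ravi', 'Pune');
    INSERT INTO stg_customer VALUES (2, 'Ravi', 'Hyderabad'), (3, 'Meena', 'Delhi');
""")

# Apply the staged changes: update rows whose key already exists, insert new ones.
# (The "WHERE true" is required by SQLite's parser for SELECT-fed upserts.)
conn.execute("""
    INSERT INTO dim_customer (id, name, city)
    SELECT id, name, city FROM stg_customer WHERE true
    ON CONFLICT(id) DO UPDATE SET name = excluded.name, city = excluded.city
""")
rows = conn.execute("SELECT id, name, city FROM dim_customer ORDER BY id").fetchall()
print(rows)  # Ravi's city is updated, Meena is appended
```

The same staged-upsert shape carries over to Snowflake's `MERGE INTO target USING staging ON ... WHEN MATCHED THEN UPDATE ... WHEN NOT MATCHED THEN INSERT ...`.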
Posted 2 weeks ago
9.0 - 14.0 years
25 - 40 Lacs
Chennai
Work from Office
Role & responsibilities We are seeking a Data Modeller with over 12 years of progressive experience in information technology, including a minimum of 4 years on data migration projects to the cloud (refactor, replatform, etc.) and 2 years of exposure to GCP.
Preferred candidate profile
- In-depth knowledge of Data Warehousing/Lakehouse architectures, Master Data Management, Data Quality Management, Data Integration, and Data Warehouse architecture
- Work with the business intelligence team to gather requirements for database design and modeling
- Understand the current on-premise DB model and refactor it for Google Cloud for better performance
- Knowledge of ER modeling, big data, enterprise data, and physical data models; design and implement data structures to support business processes and analytics, ensuring efficient data storage, retrieval, and management
- Create a logical data model and validate it to ensure it meets the demands of the business application and its users
- Experience developing physical models for SQL, NoSQL, key-value, and document databases such as Oracle, BigQuery, Spanner, PostgreSQL, Firestore, and MongoDB
- Understand the data needs of the company or client
- Collaborate with the development team to design and build the database model for both application and data warehousing development
- Classify the business needs and build both microservices and reporting database models
- Strong hands-on experience in SQL and database procedures
- Work with the development team to develop and implement a phase-wise migration plan covering the co-existence of on-prem and cloud DBs
- Help determine and manage data cleaning requirements
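The reporting side of the modeling work described above typically lands on a dimensional design: facts keyed to dimensions. A small illustrative sketch using Python's sqlite3, with invented table and column names, showing the star-schema shape and the join-and-aggregate query it is built for:

```python
import sqlite3

# Tiny star schema: one fact table referencing two dimension tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, month TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales  (
        date_key    INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        amount      REAL
    );
    INSERT INTO dim_date VALUES (20240101, 'Jan'), (20240201, 'Feb');
    INSERT INTO dim_product VALUES (1, 'Books'), (2, 'Games');
    INSERT INTO fact_sales VALUES
        (20240101, 1, 100.0), (20240101, 2, 50.0), (20240201, 1, 75.0);
""")

# The canonical reporting query: measures from the fact, labels from the dimensions.
result = conn.execute("""
    SELECT d.month, p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.month, p.category
    ORDER BY d.month, p.category
""").fetchall()
print(result)
```

The design choice being sketched: keeping measures in a narrow fact table and descriptive attributes in dimensions keeps the fact table compact and makes aggregation queries uniform, which is why the posting pairs logical modeling with warehouse performance concerns.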
Posted 2 weeks ago
9.0 - 12.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Primarily looking for a candidate with strong expertise in data-related skills, including:
- SQL & Database Management: Deep knowledge of relational databases (PostgreSQL), cloud-hosted data platforms (AWS, Azure, GCP), and data warehouses like Snowflake.
- ETL/ELT Tools: Extensive experience with SnapLogic, StreamSets, or DBT for building and maintaining data pipelines.
- Data Modeling & Optimization: Strong understanding of data modeling, OLAP systems, query optimization, and performance tuning.
- Cloud & Security: Familiarity with cloud platforms and SQL security techniques (e.g., data encryption, TDE).
- Data Warehousing: Experience managing large datasets and data marts, and optimizing databases for performance.
- Agile & CI/CD: Knowledge of Agile methodologies and CI/CD automation tools.
Important: The candidate should have a strong data engineering background with hands-on experience in handling large volumes of data, data pipelines, and cloud-based data systems.
Responsibilities:
- Build the data pipeline for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud database technologies.
- Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data needs.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Quickly analyze existing SQL code and make improvements to enhance performance, take advantage of new SQL features, close security gaps, and increase the robustness and maintainability of the code.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery for greater scalability, etc.
- Unit test databases and perform bug fixes.
- Develop best practices for database design and development activities.
- Take on technical leadership of database projects across various scrum teams.
- Manage exploratory data analysis to support dashboard development (desirable).
Required Skills:
- Strong experience in SQL, with expertise in a relational database (PostgreSQL preferable, cloud-hosted in AWS/Azure/GCP) or any cloud-based data warehouse (such as Snowflake or Azure Synapse).
- Competence in data preparation and/or ETL/ELT tools like SnapLogic, StreamSets, DBT, etc. (preferably strong working experience in one or more) to build and maintain complex data pipelines and flows handling large volumes of data.
- Understanding of data modelling techniques and working knowledge of OLAP systems.
- Deep knowledge of databases, data marts, data warehouse enterprise systems, and handling of large datasets.
- In-depth knowledge of ingestion techniques, data cleaning, de-duplication, etc.
- Ability to fine-tune report-generating queries.
- Solid understanding of normalization and denormalization of data, database exception handling, profiling queries, performance counters, debugging, and database and query optimization techniques.
- Understanding of index design and performance-tuning techniques.
- Familiarity with SQL security techniques such as column-level data encryption, Transparent Data Encryption (TDE), signed stored procedures, and assignment of user permissions.
- Experience in understanding source data from various platforms and mapping it into Entity Relationship (ER) models for data integration and reporting (desirable).
- Adherence to standards for all databases, e.g., data models, data architecture, and naming conventions.
- Exposure to source control such as Git and Azure DevOps.
- Understanding of Agile methodologies (Scrum, Kanban).
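Among the duties listed above, "ingestion techniques, data cleaning, de-dupe" usually means keeping the latest record per business key. A minimal sketch of the standard `ROW_NUMBER()` de-duplication pattern, run here against Python's sqlite3 (the same SQL shape works in PostgreSQL and Snowflake; table and column names are invented):

```python
import sqlite3

# Raw ingestion table with duplicate keys: user 1 appears twice,
# loaded at different times.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_events (user_id INTEGER, loaded_at TEXT, payload TEXT);
    INSERT INTO raw_events VALUES
        (1, '2024-01-01', 'old'), (1, '2024-02-01', 'new'),
        (2, '2024-01-15', 'only');
""")

# Rank rows per key by recency, then keep only the most recent (rn = 1).
deduped = conn.execute("""
    SELECT user_id, payload FROM (
        SELECT user_id, payload,
               ROW_NUMBER() OVER (PARTITION BY user_id
                                  ORDER BY loaded_at DESC) AS rn
        FROM raw_events
    ) WHERE rn = 1
    ORDER BY user_id
""").fetchall()
print(deduped)  # one row per user_id, latest load wins
```

For large tables, this is also where the posting's index-design concern shows up: an index on `(user_id, loaded_at)` lets the partition-and-order step avoid a full sort.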
Posted 2 weeks ago