3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You will be responsible for designing and developing ETL processes using Talend Studio 7.1.1 for data integration, data quality, and metadata management/governance.

Responsibilities:
- Design and implement data ingestion pipelines, and perform data migration and conversion tasks.
- Develop software applications in line with established development methodologies and standards.
- Collaborate with Business Analysts, Architects, and Senior Developers to establish the physical application framework and ensure appropriate architectural patterns, performance considerations, and security measures.
- Automate end-to-end ETL processes for the various datasets ingested into the big data platform (see the illustrative sketch after this listing).
- Lead technical design sessions, document project artifacts, and recommend best practices and implementation strategies using Talend ETL tools.
- Stay current with industry standards, methodologies, and best practices; other duties may be assigned as needed.

Required skills:
- File handling and integrations
- Spark / Spark Streaming
- Talend Big Data tool set
- Linux
- Python
- Strong SQL knowledge

Advantageous:
- Strong database design skills
- Good understanding of Agile development practices
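For context only, the following is a minimal PySpark sketch of the kind of file-ingestion step this listing describes. It is not taken from the posting; the file paths, column names, and quality rule are hypothetical placeholders.

```python
# Minimal PySpark sketch of a file-to-platform ingestion step.
# Paths, columns, and the quality rule are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("customer_file_ingest").getOrCreate()

# Read a pipe-delimited source file with an explicit header row.
raw = (
    spark.read
    .option("header", "true")
    .option("delimiter", "|")
    .csv("/landing/incoming/customers_20240101.csv")
)

# Basic data-quality gate: drop rows missing the business key
# and stamp the load time for downstream lineage.
clean = (
    raw.filter(F.col("customer_id").isNotNull())
       .withColumn("load_ts", F.current_timestamp())
)

# Persist to the platform's curated layer as Parquet.
clean.write.mode("append").parquet("/curated/customers/")
```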
Posted 4 days ago
4.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
The Sales Data Analyst will work with business stakeholders to understand and improve existing datasets, assist with new data development, and support Americas Sales Operations. This role is ideal for someone early in their analytics career who is eager to grow their technical and business acumen in a supportive, global environment.

RESPONSIBILITIES:
- Collaborate with the Vertiv Datalake team to support Americas datasets.
- Assist in understanding ETL jobs and translating logic into business terms.
- Investigate and help resolve data issues in coordination with regional and global teams.
- Support validation of new data items and sources.
- Contribute to best practices for data ingestion, modeling, and validation.
- Assist in creating and maintaining documentation for processes, datasets, and workflows.
- Learn and apply core concepts in data operations, automation, and visualization.

QUALIFICATIONS:
- Bachelor's degree in a relevant field (e.g., Computer Science, Statistics, Business Analytics, Engineering).
- 3+ years of experience in data-related roles; internship or academic project experience will be considered.
- Strong communication skills, especially when working with diverse, global teams.
- Proficiency in SQL with the ability to read and interpret queries.
- Exposure to Python for automation or data wrangling (preferred).
- Experience or familiarity with Power BI or similar data visualization tools (preferred).
- Curiosity and a problem-solving mindset.
- Interest in growing technical and business domain expertise.
- Ability to work independently and collaboratively.

PHYSICAL & ENVIRONMENTAL DEMANDS:
- Ability to work in a standard office environment.
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Functional Data Modeler in the Mutual Fund industry, you will play a vital role in designing data models that accurately represent fund structures, NAV calculations, asset allocation, and compliance workflows. Your expertise in data modeling, combined with a deep understanding of mutual funds and the broader BFSI domain, will be instrumental in creating schemas that meet operational and regulatory requirements.

Your responsibilities will include collaborating with business analysts and product teams to translate functional requirements into effective data structures. You will ensure that the data models you design comply with data privacy regulations, regulatory reporting standards, and audit requirements. You will also build OLTP and OLAP data models to support real-time and aggregated reporting needs, and document metadata, lineage, and data dictionaries for business use.

To excel in this role, you must have strong domain expertise in Mutual Fund/BFSI operations and a proven track record in data modeling for financial and regulatory systems. Proficiency in schema design on GCP platforms such as BigQuery and CloudSQL, along with hands-on experience with modeling tools like DBSchema or ER/Studio, is essential.

Preferred skills include experience with fund management platforms or reconciliation engines and familiarity with financial compliance standards such as SEBI regulations and FATCA. Soft skills such as strong business acumen and effective documentation capabilities will help you liaise successfully between functional and technical teams.

By joining our team, you will have the opportunity to own critical financial data architecture, influence domain-driven modeling for financial ecosystems, and be part of a fast-paced data transformation journey in the BFSI sector. If you are looking to make a significant impact in data modeling within the mutual fund industry, this role is for you.
Posted 2 weeks ago
5.0 - 10.0 years
20 - 30 Lacs
Pune, Bengaluru
Work from Office
Position: Sr. Data Analyst
Experience: 5 - 9 years
Location: Pune / Bangalore

Key Skills: Strong SQL, Python, PySpark, Jupyter Notebook, Agile / Scrum / Jira / Confluence, Microsoft Excel; exposure to any cloud (GCP / AWS / Azure) would be a plus.

Job Description:

Must-Have:
- 5 - 9 years of professional experience as a Data Analyst with good decision-making, analytical, and problem-solving skills.
- Working knowledge of Big Data frameworks like Hadoop, Hive, and Spark.
- Hands-on experience with query languages such as HQL or SQL (Spark SQL) for data exploration.
- Data mapping: determine the mapping required to join multiple data sets together across multiple sources (see the illustrative sketch after this listing).
- Documentation: data mapping, subsystem design, technical design, business requirements.
- Exposure to logical-to-physical mapping and data processing flows used to measure consistency.
- Data asset design/build: work with the data model / asset generation team to identify critical data elements and determine the mapping for reusable data assets.
- Understanding of ER diagrams and data modeling concepts.
- Exposure to data quality validation.
- Exposure to data management, data cleaning, and data preparation.
- Exposure to data schema analysis.
- Exposure to working in an Agile framework.
- SQL, PySpark, and Python, with Banking domain knowledge / Credit & Lending domain knowledge.
- Knowledge of credit risk frameworks such as Basel II/III, IFRS 9, and stress testing, and an understanding of their drivers, is advantageous.
- Retail Credit / Traded Credit knowledge: such applications will also be considered.

Good To Have:
- BFSI domain knowledge.
- Data visualization: Tableau or Qlik Sense.
- Exposure to Hadoop, Hive, and ETL.
- Working knowledge of any cloud service such as AWS, GCP, or Azure.
- Any relevant certifications would be a plus.

Role & Responsibilities:
- Take complete responsibility for the execution of sprint stories.
- Understand the business requirements from product/project stakeholders, break them into simpler stories and tasks, and map the tasks to the logical model of the solution.
- Map business entities to technical attributes with the transformation logic clearly defined.
- Be accountable for delivering tasks within the defined timelines and with good quality.
- Follow the processes for project execution and delivery, and follow agile methodology.
- Work closely with the team leads and contribute to the smooth delivery of the project.
- Understand/define the architecture and discuss its pros and cons with the team.
- Take part in brainstorming sessions and suggest improvements to the architecture/design.
- Work with other team leads to get the architecture/design reviewed.
- Keep all stakeholders updated about the project and task status, and flag any risks and issues.
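As a point of reference for the data-mapping and Spark SQL exploration skills listed above, here is a minimal, illustrative PySpark sketch. It is not taken from the posting; the source paths, view names, columns, and join key are hypothetical.

```python
# Minimal Spark SQL sketch of mapping two source datasets onto a
# common business key for exploration. All names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("credit_data_mapping").getOrCreate()

# Register two source extracts as temporary views.
spark.read.parquet("/data/core_banking/accounts").createOrReplaceTempView("accounts")
spark.read.parquet("/data/lending/exposures").createOrReplaceTempView("exposures")

# Spark SQL join expressing the logical-to-physical mapping:
# one row per account with its current credit exposure, if any.
mapped = spark.sql("""
    SELECT a.account_id,
           a.customer_id,
           a.product_code,
           e.exposure_amount,
           e.default_flag
    FROM accounts a
    LEFT JOIN exposures e
      ON a.account_id = e.account_id
""")

# Quick consistency check: how many accounts failed to map, by product?
mapped.filter("exposure_amount IS NULL").groupBy("product_code").count().show()
```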
Posted 2 weeks ago
5.0 - 10.0 years
14 - 24 Lacs
Hyderabad
Work from Office
Position: SQL Developer with Analytics
Experience: 5+ years
Location: Hyderabad
Mandatory Skills: Very strong SQL, complex and advanced SQL queries, Python, Tableau or Power BI, dashboards, data modelling, and data visualization.

Roles & Responsibilities:
- Be responsible for the development of conceptual, logical, and physical data models.
- Work with application/solution teams to implement data strategies, build data flows, and develop/execute logical and physical data models.
- Implement and maintain data analysis scripts using SQL and Python.
- Develop and support reports and dashboards using Google PLX / Data Studio / Looker.
- Monitor performance and implement necessary infrastructure optimizations.
- Demonstrate the ability and willingness to learn quickly and complete large volumes of work with high quality.
- Demonstrate excellent collaboration, interpersonal communication, and written skills, with the ability to work in a team environment.

Minimum Qualifications:
- 5+ years of solid hands-on experience with SQL, complex SQL, Google analytics tooling, and dashboard development.
- Hands-on experience with the design, development, and support of data pipelines.
- Strong SQL programming skills (joins, subqueries, queries with analytical functions, stored procedures, functions, etc.); see the illustrative sketch after this listing.
- Hands-on experience using statistical methods for data analysis.
- Experience with data platform and visualization technologies such as Google PLX dashboards, Data Studio, Tableau, Pandas, Qlik Sense, Splunk, Humio, Grafana.
- Experience in web development: HTML, CSS, jQuery, Bootstrap.
- Experience with machine learning packages such as Scikit-Learn, NumPy, SciPy, Pandas, NLTK, BeautifulSoup, Matplotlib, Statsmodels.
- Strong design and development skills with meticulous attention to detail.
- Familiarity with Agile software development practices and working in an agile environment.
- Strong analytical, troubleshooting, and organizational skills.
- Ability to analyse and troubleshoot complex issues, and proficiency in multitasking.
- Ability to navigate ambiguity.
- BS degree in Computer Science, Math, Statistics, or equivalent academic credentials.

Interested candidates: share your CV with updated projects to dikshith.nalapatla@motivitylabs.com, including the details below for a quick response.
- Total Experience:
- Relevant SQL Experience:
- Tableau or Power BI Experience:
- Current Role / Skillset:
- Current CTC (Fixed / Variables / Bonus, if any):
- Payroll Company (Name):
- Client Company (Name):
- Expected CTC:
- Official Notice Period:
- Serving Notice (Yes / No):
- CTC of offer in hand:
- Last Working Day (in current organization):
- Location of the offer in hand:
- Willing to work from office:
- UK Shift (yes/no):

Note: the UK shift runs 2:00 PM to 11:00 PM, with 5 days of work from office.
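To illustrate the "queries with analytical functions" requirement above, here is a minimal sketch of a window-function query driven from Python. It is not taken from the posting; the table, columns, and data are hypothetical, and it assumes an SQLite build with window-function support (3.25+) rather than the team's actual warehouse.

```python
# Minimal sketch of an analytical (window) SQL query driven from Python.
# Table, columns, and data are hypothetical; assumes SQLite 3.25+.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, rep TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('NA', 'Asha', 1200), ('NA', 'Bo', 900),
        ('EU', 'Chen', 1500), ('EU', 'Dana', 700);
""")

# Rank reps within each region by revenue and show the regional total --
# the kind of analytical-function query the listing asks about.
query = """
    SELECT region,
           rep,
           amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS region_rank,
           SUM(amount) OVER (PARTITION BY region) AS region_total
    FROM sales
    ORDER BY region, region_rank
"""

for row in conn.execute(query):
    print(row)
```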
Posted 2 months ago