# Minimum of four years’ work experience, preferably in product development.
# Bachelor’s degree in computer programming, computer science, or a related field.
Work Location – Chennai, Bangalore, Hyderabad
Key Responsibilities –
# The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for the product team. The selected candidate will support our database architects, data analysts, and data scientists on data initiatives, and will ensure that the data delivery architecture remains optimal and consistent across ongoing projects.
# Design and develop scalable solutions to store and retrieve high-volume time-series data for various use cases, including fast data access and data science experimentation.
# Build and maintain data quality validation and testing software.
# Troubleshoot production issues and optimize the systems involved.
# Create and maintain optimal data pipeline architecture.
# Assemble large, complex data sets that meet functional / non-functional business requirements.
# Identify, design, and implement internal process improvements, such as automating manual processes, optimizing data delivery and re-designing infrastructure for greater scalability.
# Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using AWS ‘big data’ technologies (a minimal sketch follows this list).
# Create data tools that help analytics and data science team members build and optimize our products into an innovative industry leader.
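The pipeline and ETL bullets above are stated in general terms; as a rough illustration only, here is a minimal PySpark sketch of the extract-transform-load pattern for high-volume time-series data. The bucket paths, column names, and schema are hypothetical assumptions, not this team's actual stack.

```python
# Minimal PySpark ETL sketch: ingest raw time-series events, normalize them,
# and write date-partitioned Parquet. Paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("timeseries-etl").getOrCreate()

# Extract: raw JSON events landed in an S3 bucket (hypothetical path).
raw = spark.read.json("s3a://example-raw-bucket/events/")

# Transform: parse timestamps, drop malformed rows, derive a date partition.
clean = (
    raw.withColumn("event_ts", F.to_timestamp("event_ts"))
       .dropna(subset=["event_ts", "sensor_id", "value"])
       .withColumn("event_date", F.to_date("event_ts"))
)

# Load: partitioning by date lets downstream queries prune efficiently,
# which supports the fast-data-access use case named above.
(clean.write.mode("append")
      .partitionBy("event_date")
      .parquet("s3a://example-curated-bucket/timeseries/"))
```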
Skill required for the job/professional skill set –
# Advanced working SQL knowledge, including query authoring, experience working with object-oriented databases, and working familiarity with a variety of other databases.
# Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
# Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
# Building processes supporting data transformation, data structures, metadata, dependency, and workload management.
# A successful history of manipulating, processing and extracting value from large, disconnected datasets.
# Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores (a sketch of the streaming pattern follows this list).
# Strong organizational & communication skills.
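As a hedged illustration of the message-queuing and stream-processing skills listed above, here is a minimal consumer sketch using the kafka-python client; the topic name, message schema, and quality check are assumptions made for the example.

```python
# Minimal stream-processing sketch using the kafka-python client.
# Topic name and message schema are assumptions for illustration.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "sensor-events",                       # hypothetical topic
    bootstrap_servers=["localhost:9092"],
    group_id="quality-checks",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # Basic data-quality gate: skip records missing required fields.
    if not all(k in event for k in ("sensor_id", "event_ts", "value")):
        continue
    # Downstream handling (e.g., writing to a scalable data store) goes here.
    print(event["sensor_id"], event["value"])
```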
Technology Skills –
# Candidates should also have experience with the following technologies:
# Building scalable data pipelines to ingest and transform streaming and batch data using distributed technologies such as Apache Spark, Kafka, and AWS Kinesis.
# Relational SQL databases (PostgreSQL preferred) and NoSQL databases (MongoDB preferred)
# Complete project ownership and responsibility to drive partner integrations from requirements to go-live
# Maintain and publish digital application APIs / web services for various internal and external integrations (see the sketch after this list)
# Provide status reporting on project milestones, deliverables, dependencies, risks, and issues, communicating across leadership for LOS and LMS application projects for the Digital business
# Work closely with the business team and help define tech requirements for LOS / LMS integrations with new business partners
# Drive tech development with engineering team using agile principles, incorporating feedback from users, developers, business, and other stakeholders
# Work with partners to resolve their dev / UAT queries, ensure partners go live in a time-bound manner, and take a hands-on approach to resolving issues across LMS and third-party interfaces for designated partners
# Work on agile application implementation projects for the digital business while meeting development schedules and ensuring that the delivered solution meets technical specifications and design requirements.
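To make the API-publishing bullet concrete, below is a minimal sketch of a partner-facing web service using FastAPI. The endpoint path, payload fields, and status value are hypothetical placeholders, not the actual LOS/LMS interface; a real service would persist to the LOS and return its identifiers.

```python
# Minimal FastAPI sketch of a partner-facing web service.
# Endpoint, payload, and status values are hypothetical, not the
# actual LOS/LMS interface.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="partner-integration-api")

class LoanApplication(BaseModel):
    partner_id: str
    applicant_name: str
    amount: float

@app.post("/v1/applications")
def create_application(application: LoanApplication):
    # A real integration would write this to the LOS and return its ID.
    return {"status": "RECEIVED", "partner_id": application.partner_id}
```

Such a service would typically be run with an ASGI server such as uvicorn, and versioned so that partner integrations are not broken by changes.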
Skill required for the job/professional skill set –
# Understanding of loan origination systems (LOS), lending software suites, and partner API interface integrations.
# Extensive knowledge of API technologies and testing tools (Postman / SoapUI); a scripted equivalent is sketched after this list.
# AWS Cloud Basics
# Good communication skills for external and internal stakeholder management
# Proven ability to work creatively and analytically in a problem-solving environment demonstrating teamwork, innovation, and excellence
# Self-motivated, decisive, with the ability to adapt to change and competing demands
# 6-8 years of full-time work experience in tech companies or digital banking/fintech
# Excellent working knowledge of various financial lending products such as retail loans, consumer durable loans, business loans, lines of credit, credit cards, etc.
# Domain Exposure in Financial Services and experience in Digital Lending, Banks or NBFCs
# Worked on integration projects using the Agile framework
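Postman and SoapUI are interactive tools, but the same request-and-assert checks can be scripted for repeatable partner UAT. Below is a hedged sketch using the requests library against the hypothetical endpoint from the earlier service sketch.

```python
# Scripted equivalent of a Postman-style API check, using requests.
# URL, payload, and expected fields are hypothetical.
import requests

BASE_URL = "https://api.example.com"  # hypothetical partner sandbox

resp = requests.post(
    f"{BASE_URL}/v1/applications",
    json={"partner_id": "P-001", "applicant_name": "Test User", "amount": 50000.0},
    timeout=10,
)

# Assert on status code and response shape, as a Postman test would.
assert resp.status_code == 200, resp.text
body = resp.json()
assert body.get("status") == "RECEIVED"
print("integration check passed:", body)
```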
Academic Requirements – Bachelor’s/Master’s in Engineering
Work Location – Chennai, Bangalore, Hyderabad
# Lead and assist with Dashboard and advanced analytics projects
# Create Data Models, ETL jobs and processes, and executive Operational Dashboards and Analytical reports
# Document findings and lessons learned from POCs and discovery projects
# Work in a time-constrained environment to analyze, design, develop, and deliver customer-centric solutions and applications
# Optimize the dashboard for best user experience.
# Provide out-of-the-box solutions to critical business problems
# Participate in solutioning meetings and convert discussions into requirements and designs
# Communicate technical and business topics in a 360-degree fashion as required, using written, verbal, and/or presentation materials as necessary.
# Utilize technical and domain knowledge to develop and implement effective solutions; provide hands on mentoring to team members through all phases of the Systems Development Life Cycle (SDLC) using Agile practices.
Required Skills –
# SQL, PowerBI, ADF
# Experience in dashboard migration from other platforms to Power BI preferred
# Experience with Power BI DirectQuery preferred
# Experience with DAX Studio, SSMS, and Azure preferred (a cross-check sketch against the SQL source follows this list)
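As a hedged illustration of the SQL skills above, the sketch below cross-checks a dashboard aggregate directly against the SQL source, the kind of validation often done alongside Power BI and DAX Studio work. The connection string, table, and column names are all hypothetical.

```python
# Sketch: validate a dashboard aggregate against the SQL source.
# Connection string, table, and column names are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=example-server.database.windows.net;"
    "DATABASE=sales;UID=report_user;PWD=***"
)
cursor = conn.cursor()

# The same aggregation a DAX measure like SUM(Sales[Amount]) would compute.
cursor.execute(
    "SELECT region, SUM(amount) AS total_sales "
    "FROM dbo.sales GROUP BY region ORDER BY total_sales DESC"
)
for region, total_sales in cursor.fetchall():
    print(region, total_sales)

conn.close()
```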
Qualification – Bachelor’s/Master’s degree in Engineering
Work Location – Chennai, Bangalore, Hyderabad or WFH
# Create Data Models by extracting data from various data sources, connecting through JDBC, ODBC, and OLEDB.
# Develop Data Models with data virtualization techniques on the Denodo platform, connecting to multiple data sources such as SQL Server, Oracle, Hadoop, etc. (see the sketch after this list).
# Develop business requirements, dashboards, and data verification and analysis.
# Integrate Denodo with Oracle, SQL Server, and MySQL databases using JDBC.
# Manage end-to-end data flow using a virtualization layer at the data warehouse.
# Apply excellent knowledge of SQL query optimization, stored procedures, packages, database triggers, views, and functions.
# Publish data sources as REST- and SOAP-based web services.
# Complete lifecycle implementation of Business Intelligence with star and snowflake schemas, and fact and dimension tables.
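Because Denodo exposes its virtual databases over standard ODBC and JDBC interfaces, the virtualization layer can be queried like any other database. Below is a minimal, hedged Python sketch using pyodbc; the DSN, credentials, and view name are assumptions for illustration.

```python
# Sketch: query a Denodo virtual database over ODBC.
# DSN, credentials, and view name are hypothetical.
import pyodbc

conn = pyodbc.connect("DSN=denodo_vdb;UID=report_user;PWD=***")
cursor = conn.cursor()

# A virtual view that could federate SQL Server, Oracle, and Hadoop sources.
cursor.execute("SELECT customer_id, total_orders FROM customer_360")
for row in cursor.fetchmany(10):
    print(row)

conn.close()
```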
Required Skills –
# Data Modelling, SQL, ETL tools, Python, Cloud (Azure & AWS)
# Strong working knowledge of any of the ETL tools
# Experience in conceptual and logical modeling and physical database design for OLTP and OLAP systems (a star-schema sketch follows this list)
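As a small, self-contained illustration of the star-schema and OLAP modeling skills above, the sketch below builds one fact table and two dimension tables in sqlite3 and runs a typical join-and-aggregate query; all table and column names are illustrative.

```python
# Star-schema sketch: one fact table keyed to dimension tables.
# sqlite3 keeps the example self-contained; names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, full_date TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount      REAL
);
""")

# A typical OLAP query: join the fact table to its dimensions and aggregate.
rows = conn.execute("""
    SELECT d.full_date, p.name, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.full_date, p.name
""").fetchall()
print(rows)
```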