Data Engineer - Quezon City, Philippines - Joseph Riley Recruitment Services Inc

    Description
    Direct Hire, Permanent Employment
    Quezon City | Hybrid, 2-3 days on-site
    Midshift Schedule

    Responsibilities:
    • Design and develop data pipelines and ETL jobs using Big Data Technologies based on functional/non-functional business requirements
    • Design and implement data integration/ingestion/extraction solutions based on high-level architecture design
    • Identify, design and implement process improvements & delivery optimizations
    • Collaborate with stakeholders, business analysts, and data architects to translate business requirements into technical solutions
    • Develop big data and analytics solutions, leveraging new or existing technology, to advance all of Manulife's lines of business
    • Perform exploratory data analysis: query and process on-premises or cloud-based data; provide reports; summarize and visualize the data
    • Design, upgrade, and implement new data workflows, automation, tools, and API integrations
    • Perform proofs of concept (POCs) on new integration patterns and solutions
    • Write and maintain technical documentation
    • Perform unit tests and system integration tests
    • Execute updates, patches, and other activities required to maintain and enhance the operation of on-premises or cloud-based environments
    • Support the Agile delivery squads when required
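
    For illustration only (not part of the posting), the extract-transform-load work described above can be sketched as a minimal pipeline. The table name, column names, and inline data below are hypothetical, using only the Python standard library:

```python
import csv
import io
import sqlite3

# Hypothetical inline source data standing in for an upstream extract.
RAW_CSV = """policy_id,premium,status
P-001,1200.50,active
P-002,nan,active
P-003,800.00,lapsed
"""

def extract(text: str) -> list[dict]:
    """Extract: parse raw CSV text into dictionaries, one per row."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: keep only active policies with a valid, non-NaN premium."""
    out = []
    for r in rows:
        try:
            premium = float(r["premium"])
        except ValueError:
            continue  # drop malformed records
        if r["status"] == "active" and premium == premium:  # NaN != NaN
            out.append((r["policy_id"], premium))
    return out

def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    """Load: write cleaned rows into a warehouse table; return row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS policies (policy_id TEXT, premium REAL)")
    conn.executemany("INSERT INTO policies VALUES (?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM policies").fetchone()[0]

conn = sqlite3.connect(":memory:")
count = load(transform(extract(RAW_CSV)), conn)
print(count)  # 1 -- only P-001 survives: P-002 has a NaN premium, P-003 is lapsed
```

    In a production setting the same extract/transform/load stages would typically run on Spark or Azure Data Factory rather than in-process, but the structure is the same.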
    Ideal Candidate:
    • At least 2 years' experience as a Data Engineer with a focus on big data processing and/or relational databases
    • At least 2 years' experience working with the Microsoft Azure data platform, specifically Azure Data Lake, Azure Data Factory, and Azure Databricks
    • Experienced in any of the following programming/scripting languages: SQL, Python, Shell, Scala
    • Experienced in creating data pipelines & developing complex and optimized queries
    • Experienced in working with structured, semi-structured, and unstructured datasets
    • Knowledgeable in any of the following big data tools and technologies: Hadoop, Spark, Hive, Sqoop, Kafka, NiFi
    • Knowledgeable in relational SQL and NoSQL databases: MSSQL, Postgres, HBase (MongoDB is a plus)
    • Experienced with workflow management tools: Airflow, Crontab, CA Workload Automation
    • Knowledgeable with CI/CD tools
    • Knowledgeable in, or at least familiar with the basics of, data visualization in any of the following tools: Tableau, Power BI, QlikView/Qlik Sense
    • Knowledgeable in using collaboration tools (e.g., MS Teams/Skype, Confluence, JIRA)
    • Experience with any SDLC methodology and familiarity with different Agile methodologies
    • Demonstrates a commitment to delivering excellent service, balanced with appropriate risk management
    • Monitors, validates, and drives continuous improvement to methods, and proposes enhancements to data sources that improve usability and results
    • Good communication and presentation skills
    • Analytical, structured, organized, and proactive
    • Stakeholder and project management experience is a plus
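
    For illustration only (not part of the posting), the "complex and optimized queries" mentioned above often start with indexing. The table name, index name, and data below are hypothetical; the sketch uses SQLite from the Python standard library:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (id INTEGER PRIMARY KEY, status TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO claims (status, amount) VALUES (?, ?)",
    [("open", 100.0), ("closed", 250.0), ("open", 75.5)],
)

# Without an index, the filter below scans every row; with the index,
# SQLite can do a b-tree search on status -- a typical first tuning step.
conn.execute("CREATE INDEX idx_claims_status ON claims (status)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM claims WHERE status = 'open'"
).fetchall()
total = conn.execute(
    "SELECT SUM(amount) FROM claims WHERE status = 'open'"
).fetchone()[0]

print(total)        # 175.5
print(plan[0][-1])  # plan detail should mention idx_claims_status
```

    The same reasoning (filter selectivity, index coverage, avoiding full scans) carries over to the MSSQL and Postgres systems named in the requirements.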