Job Description
Principal Duties and Responsibilities:
Participates in Agile sprints and ceremonies; supports rapid iteration and development
Develops, maintains, and tests data pipelines, application frameworks, and infrastructure for data generation; works closely with information architects and data scientists
Initializes and manages cloud platforms: AWS, Azure
Builds code to specifications and standards
Contributes code at the business unit (BU) level, driving greater positive impact for specific organizational entities
Contributes to the success of a team
Job Requirements:
Familiarity with agile and DevOps principles, test-driven development, continuous integration, and other approaches to accelerate the delivery of new features
Understanding of software development lifecycle
Understanding of how technology supports Manulife business strategy
Familiar with database and big data technologies such as MySQL, Postgres, Hadoop, and Spark
Familiar with data quality technologies such as Informatica
Strong working knowledge of the key aspects of the role
Stays abreast of developments in own technical discipline and is able to recognize and translate these to own working environment
Participates in functional demos utilizing new tech; fewer controls required
Sees actions as a series of linked steps
Understands the informal structure, company culture, cooperation, and information sharing within the company
Collaborative attitude, willingness to work with team members; able to coach, participate in code reviews, share skills and methods
Constantly learns from both success and failure
Good organizational and problem-solving abilities that enable you to manage through creative abrasion
Good verbal and written communication; able to effectively articulate technical vision, possibilities, and outcomes
Experiments with emerging technologies and understands how they will impact what comes next.
Decision Authorities:
Straightforward tasks likely to be completed to an acceptable standard
Able to achieve some steps using own judgement, but supervision is needed
Drives less than 30% of the time in paired programming
Possesses analytical skills and appreciates complex situations, but may only achieve partial resolution
Required Knowledge and Skills:
Demonstrated 2-3 years of professional experience in big data technology, with modern data architecture and modeling skills, along with a university degree in Engineering, Computer Science, or an equivalent program;
Extensive expertise in data technologies and in the development of reporting and analytics tools, with a particular focus on cloud (Azure) and Hadoop-based technologies and on programming or scripting languages such as Java, Scala, and/or C++ in Linux environments.
Working knowledge of different (NoSQL or RDBMS) databases such as HAWQ/HDB, MongoDB, Cassandra, or HBase;
Working experience with modern data streaming using Kafka, Apache Spark, and Flink, and with data ingestion frameworks such as NiFi, Hive, and Pig;
Extensive experience in modern data architecture, modeling, and semantic layer design. Knowledge of insurance data models is preferred.
Extensive experience in Master Data Management practices and principles
Understanding of security and data protection solutions such as Kerberos, Active Directory, HDFS, and LDAP
Experience and capability in translating non-technical user requests into complex technical specifications and solutions that meet these requirements;
Excellent organizational and time-management skills, with the ability to multi-task and to work with minimal or no supervision while performing duties; has the initiative to organize the various functions necessary to accomplish department activities or goals, and is a strong team player.