PHINSYS
The Phinsys Suite is supported by a Processing Framework that underpins the full product set, providing the functionality required to integrate any source system. Through investment in R&D and constant innovation, Phinsys offers market-leading solutions through its best-in-class modules or via its complete stack.
TECHNOLOGY
PROCESSING


SOURCE

Configurable connections
File or database input
Integration with most OLEDB providers
Phinsys has experience with all the major insurance platforms and expertise in mapping data in from dozens of source systems. The product allows connectivity to all OLE DB sources as well as bespoke imports of file-based data.
DATA STORAGE

MS SQL 2016 and above
SQL Server 2016 SP1 and above is recommended, as advances in database technology (such as columnstore indexes and partitioning) further improve ETL and reporting performance.
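
A minimal sketch of why these features matter, assuming a hypothetical fact table rather than the Phinsys schema: the table is partitioned by accounting period and stored with a clustered columnstore index, which is what improves both ETL loads and reporting aggregations.

```sql
-- Illustrative only: hypothetical table and column names, not the Phinsys model.
CREATE PARTITION FUNCTION pfAccountingPeriod (int)
    AS RANGE RIGHT FOR VALUES (202301, 202302, 202303);

CREATE PARTITION SCHEME psAccountingPeriod
    AS PARTITION pfAccountingPeriod ALL TO ([PRIMARY]);

CREATE TABLE dbo.FactPremium
(
    AccountingPeriod int            NOT NULL,
    PolicyKey        int            NOT NULL,
    GrossPremium     decimal(19, 4) NOT NULL
) ON psAccountingPeriod (AccountingPeriod);

-- Columnstore storage compresses the fact data and accelerates the
-- aggregations typically issued by the reporting layer.
CREATE CLUSTERED COLUMNSTORE INDEX ccix_FactPremium
    ON dbo.FactPremium;
```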
META DATA MANAGEMENT

Calculations
Rules
Parameterisation
A user-driven interface for defining calculation and data-filtering rules, with support for parameterisation at different scopes.
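
The rules themselves are configured through the product's interface; the sketch below is only a hypothetical illustration of how a parameterised filter might resolve, with the most specific scope (here, run level over global) winning. The tables, columns and parameter names are assumptions, not the actual Phinsys metadata model.

```sql
-- Hypothetical rule metadata, for illustration only.
DECLARE @RunId int = 42;

WITH ResolvedParameter AS
(
    -- Pick the most specific scope available: run-level beats global.
    SELECT TOP (1) p.ParameterValue
    FROM dbo.RuleParameter AS p
    WHERE p.ParameterName = 'MinGrossPremium'
      AND (   (p.Scope = 'Run'    AND p.RunId = @RunId)
           OR (p.Scope = 'Global' AND p.RunId IS NULL))
    ORDER BY CASE p.Scope WHEN 'Run' THEN 1 ELSE 2 END
)
-- Apply the resolved parameter as a data-filtering rule.
SELECT f.*
FROM dbo.FactPremium AS f
CROSS JOIN ResolvedParameter AS rp
WHERE f.GrossPremium >= CAST(rp.ParameterValue AS decimal(19, 4));
```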
ETL

Processing framework engine
An intelligent, versatile, scalable and efficient ETL framework designed to manage and orchestrate the processing of data from source systems all the way through to the reporting layer.
VISUALISATION

MI reporting
Drill to detail reports
Dashboards in Power BI
Integration with 3rd party tools
Integration with Excel
The Phinsys reporting layer supports industry-leading client reporting tools, including Power BI, Tableau and Business Objects. Inform provides seamless interaction with Power BI, allowing dashboards, reports and KPIs to be fully integrated. Core reporting is provided through standard analytical tools such as SSAS and SSRS, accessible via Excel or Power BI.
APPLICATION

Web interface
Zero-Touch installation
Integrated security
Accessible through a web browser via an internally managed portal, with no deployment to client desktop machines required. Integrated security is driven by Active Directory, with groups easily assigned to application roles.
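
Purely as a hedged illustration of assigning Active Directory groups to application roles, the snippet below uses a simple mapping table; the group, role and table names are hypothetical, and in practice the assignment is managed through the web interface.

```sql
-- Hypothetical role mapping, not the actual Phinsys security schema.
CREATE TABLE dbo.RoleMembership
(
    AdGroupName     nvarchar(256) NOT NULL,   -- Active Directory group
    ApplicationRole nvarchar(100) NOT NULL,   -- application role it maps to
    CONSTRAINT PK_RoleMembership PRIMARY KEY (AdGroupName, ApplicationRole)
);

INSERT INTO dbo.RoleMembership (AdGroupName, ApplicationRole)
VALUES (N'DOMAIN\Finance-Close',   N'Approver'),
       (N'DOMAIN\Actuarial-Users', N'ReadOnly');
```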
WORKFLOW & ORCHESTRATION

User workflow for approval
Parallel execution
Resumable
An approval process for file uploads provides validation before data is accepted. A GUI-based dataflow setup allows custom ETL and data processing, with dependency constraints between tasks for optimal control flow, and independent tasks are executed in parallel for maximum throughput.
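
A minimal sketch of dependency-constrained execution, assuming hypothetical orchestration tables rather than the framework's real model: a pending task becomes eligible to run, potentially alongside others, once all of its predecessors have completed.

```sql
-- Hypothetical orchestration metadata, for illustration only:
--   dbo.Task(TaskId, Status) with Status in ('Pending', 'Running', 'Complete')
--   dbo.TaskDependency(TaskId, DependsOnTaskId)
SELECT t.TaskId
FROM dbo.Task AS t
WHERE t.Status = 'Pending'
  AND NOT EXISTS
  (
      SELECT 1
      FROM dbo.TaskDependency AS d
      JOIN dbo.Task AS p ON p.TaskId = d.DependsOnTaskId
      WHERE d.TaskId = t.TaskId
        AND p.Status <> 'Complete'
  );
-- Every task returned has no incomplete predecessors, so the engine
-- can start them all at once for maximum throughput.
```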
PROCESS DEPLOYMENT

Design-time deployment
Modular control flows
All front-end-configured processes are deployed to a shared location and encapsulated as SSIS packages.
TASK DEFINITION

Fact builder
Dimension builder
Allocation task
Extract and stage task
Bespoke task integration
Proven design patterns are coded to deploy consistent processing logic for common tasks. These tasks are provided out of the box for quick and simple configuration, and support for incremental data processing is inherent in all of them, providing fast and streamlined processing.
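
As a hedged illustration of the incremental pattern behind tasks such as the fact builder, the sketch below loads only source rows newer than the last recorded high-water mark; every object name is hypothetical.

```sql
-- Illustrative incremental fact load, not the packaged task itself.
DECLARE @LastLoaded datetime2 =
    ISNULL((SELECT LoadedUpTo FROM etl.Watermark WHERE TableName = 'FactPremium'),
           '19000101');

INSERT INTO dbo.FactPremium (AccountingPeriod, PolicyKey, GrossPremium)
SELECT s.AccountingPeriod, s.PolicyKey, s.GrossPremium
FROM stage.Premium AS s
WHERE s.ModifiedDate > @LastLoaded;   -- only rows that arrived since the last run

UPDATE etl.Watermark
SET LoadedUpTo = (SELECT MAX(ModifiedDate) FROM stage.Premium)
WHERE TableName = 'FactPremium';
```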
META DATA DRIVEN

Data source mappings
Data flow configuration
Connections
Parameterised
A fast and simple interface for mapping data sources, with automated mapping of matching field names. Migrate easily between environments by updating connection configuration and using parameters.
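
The automated mapping of matching field names can be pictured with the sketch below, which pairs source and target columns whose names agree using INFORMATION_SCHEMA; the table names are hypothetical, and the real mappings are maintained through the front-end interface.

```sql
-- Illustrative auto-mapping of like-named columns; table names are hypothetical.
SELECT src.COLUMN_NAME AS SourceColumn,
       tgt.COLUMN_NAME AS TargetColumn
FROM INFORMATION_SCHEMA.COLUMNS AS src
JOIN INFORMATION_SCHEMA.COLUMNS AS tgt
  ON tgt.COLUMN_NAME = src.COLUMN_NAME
WHERE src.TABLE_SCHEMA = 'stage' AND src.TABLE_NAME = 'Premium'
  AND tgt.TABLE_SCHEMA = 'dbo'   AND tgt.TABLE_NAME = 'FactPremium';
```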
DATA PROCESSING

Scale-up or Scale-out
Fast
Recoverable
Auditing and logging
A scalable and efficient ETL framework that reduces overnight batch processing times. Integration Services tasks are deployable from the front end and are metadata driven, using tried-and-tested design patterns for efficient, fault-tolerant ETL processing. ETL tasks are designed to be incremental and resumable, ensuring data integrity and coherence at all times.
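
To make "recoverable" and "auditing and logging" concrete, here is a hedged sketch of a run log that records each step's outcome, so that on a restart any step already marked complete for the batch can be skipped and the run resumes where it failed. The table, column and step names are assumptions, not the framework's actual logging schema.

```sql
-- Hypothetical audit log, for illustration only.
CREATE TABLE etl.RunLog
(
    BatchId    int           NOT NULL,
    StepName   nvarchar(100) NOT NULL,
    Status     nvarchar(20)  NOT NULL,   -- 'Complete' or 'Failed'
    RowsLoaded int           NULL,
    FinishedAt datetime2     NULL,
    CONSTRAINT PK_RunLog PRIMARY KEY (BatchId, StepName)
);

-- On a re-run of batch 42, execute only the steps that have not yet completed.
SELECT s.StepName
FROM etl.StepDefinition AS s   -- hypothetical list of configured steps
WHERE NOT EXISTS
(
    SELECT 1
    FROM etl.RunLog AS r
    WHERE r.BatchId = 42
      AND r.StepName = s.StepName
      AND r.Status = 'Complete'
);
```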