Data & Analytics Services

Businesses leverage our high level of expertise in big data platforms, and the extensive, hands-on experience of our certified consultants, to gain control of their data environment and start driving sustainable solutions. Our customers benefit from accelerated project delivery, reduced costs, increased agility, and embedded best practice, governance and security.

Data & Analytics Consultancy

Whether you are planning a new data initiative, revising your strategy, upgrading your architecture or simply want to improve outcomes, WHISHWORKS can help you gain clarity on how to best achieve your business and technology objectives.

With a proven discovery methodology, we help you develop a scope for the initiative that is aligned to your business objectives, and define a minimum viable product, an optimal solution and the shortest path to achieving them.

Our approach

Our Data & Analytics Consultants will work closely with you to:

    Scope project requirements
    Record data source systems, data integration and quality standards, data extraction methodologies and interfaces
    Document user stories, architectural and non-functional requirements
    Architect and implement a proof of concept

 

What we deliver

At WHISHWORKS, we begin with an initial project charter document to scope project requirements. Following this, we draft a project plan containing an initial schedule and an estimate of required effort for its implementation.

What follows is a high-level technical architecture document detailing recommendations for the architecture needed to support the proposed solution, and a high-level user story backlog describing an agreed understanding of requirements.

Application Delivery

You are ready to solve your problem but are not sure which architecture, platform and tools would be right to use.

To build a robust data management and analytics platform, the first and most important step is to get the foundations right, i.e. clear-cut business requirements and a meticulous understanding of workload, security and capacity. Secondly, the platform has to support data and analytics at scale. With our experience building systems that handle petabytes of data and scale to hundreds of nodes, you can be confident that the desired form and function will be met.
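
To give a flavour of the capacity-planning step, here is a minimal back-of-envelope sizing sketch in Python; the daily ingest volume, retention period, compression ratio, headroom and per-node capacity are hypothetical placeholders rather than recommendations, and a real sizing exercise would also account for workload, memory and growth patterns.

    # Back-of-envelope cluster storage estimate (illustrative figures only).
    daily_ingest_tb = 2.0        # hypothetical raw data landed per day, in TB
    retention_days = 365         # how long data is kept on the cluster
    replication_factor = 3       # typical HDFS block replication
    compression_ratio = 0.5      # assume data compresses to ~50% of raw size
    headroom = 1.3               # ~30% spare capacity for growth and temp space

    usable_tb = daily_ingest_tb * retention_days * compression_ratio
    raw_tb = usable_tb * replication_factor * headroom

    node_capacity_tb = 48        # hypothetical usable disk per worker node
    nodes_needed = -(-raw_tb // node_capacity_tb)  # ceiling division

    print(f"Estimated raw storage: {raw_tb:,.0f} TB across ~{nodes_needed:.0f} nodes")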

Our approach

Our Data & Analytics Consultants work closely with you to:

    Understand platform requirements, security, governance and information standards
    Architect and design the platform based on your application, infrastructure and security requirements
    Size and plan the platform capacity based on historical and future data volumes and workload
    Design cluster security based on the core security standards and regulatory mandates
    Install the platform with all required components as per the design
    Implement the required security configurations
    Install the toolset needed for constant monitoring and proactive alerting

What we deliver

Our experts will deliver a fully optimised and secured data management platform in line with your organisational standards. We also provide an operational guide and runbook with detailed information on day-to-day cluster management. And, to ensure your investment is sustainable, we offer knowledge transfer on cluster operations and management.

Data Integration Service

Connecting Applications, Data and Devices

Analysing data stored in silos has limited value. With an integrated data lake, you can unlock insights that would have otherwise been missed.

WHISHWORKS provides a unified solution for Data Integration: building, deploying, and managing real-time data-centric architectures in a big data environment.

Once the big data platform is set up, our data integration services cover multiple elements of integration, such as batch and real-time data ingestion from identified sources. Data transformation, verification and quality activities ensure that the information available is up to date, accurate, and consistent across systems.
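
As a minimal illustration of real-time ingestion into a data lake, the sketch below uses Spark Structured Streaming to read events from a Kafka topic and land them as Parquet files; the broker address, topic name and paths are hypothetical, and the spark-sql-kafka connector is assumed to be available on the cluster.

    # Minimal streaming ingestion sketch: Kafka -> Parquet on the data lake.
    # Broker, topic and paths are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = SparkSession.builder.appName("ingest-orders").getOrCreate()

    events = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker1:9092")
        .option("subscribe", "orders")
        .option("startingOffsets", "latest")
        .load()
        .select(col("key").cast("string"), col("value").cast("string"), "timestamp")
    )

    query = (
        events.writeStream
        .format("parquet")
        .option("path", "/data/lake/raw/orders")
        .option("checkpointLocation", "/data/lake/checkpoints/orders")
        .start()
    )

    query.awaitTermination()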

Our approach

Our Data & Analytics Consultants will work closely with your team to:

    Scope data ingestion, transformation and quality management requirements
    Record data source systems, data consuming platforms, data formats, data mappings, business rules, data transformation, data quality, data extraction methods and interfaces
    Understand and document non-functional requirements such as data volume and throughput
    Design the key data processing frameworks/engines required for the implementation
    Design the detailed technical architecture and ensure adherence to the end-to-end data lifecycle
    Implement an end-to-end data integration solution
    Perform unit, functional and performance testing to ensure business and non-functional requirements are met

Managed Services

Designed to give you the responsiveness and flexibility you need to turn data into actionable insights, our comprehensive managed services model lets you focus on your business strategy instead of worrying about the technology infrastructure supporting it.

  • Core Platform Support

    • Cluster monitoring & health checks
    • Patching & minor upgrades
    • User access management
    • Performance tuning & optimisation
    • Incident resolution & root-cause analysis
    • Cluster support ensuring high availability
    • Hadoop jobs monitoring, application enhancements and ongoing application support
  • Project Delivery

    • Platform-to-business alignment
    • Platform transition & change
    • Platform modernisation
    • Platform component standardisation
    • E2E data pipelines transformation
    • Data analysis and assessment
    • Data integration & architecture
    • Data engineering
    • Integration with existing Enterprise Data Warehouse & Sources
  • DevOps-as-a-Service

    • Platform Automation
    • Continuous integration
    • Continuous release & deployment
  • Service Improvements

    • Continuous monitoring
    • SLA improvements
    • Regular analysis & reporting

Data Platform Migration Services

The recent merger and acquisition announcements within the big data ecosystem have brought a level of confusion and uncertainty. Many on-premise platform users are worried about the volatility of their platform, the potential risks to their data and applications, and the suitability and viability of alternative courses of action. We can help.

Some of our customers are in the process of assessing or planning the migration of their on-premise big data platform to a cloud-native data platform. Our team of experienced architects and data engineers can guide you through the design and implementation of a complete migration (data, apps and workflows) from your on-premise platform to a cloud-native data platform (eg AWS, Google GCP, or MS Azure).
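
As a simple illustration of one building block in such a migration, the sketch below copies an HDFS dataset to cloud object storage with Spark; the paths and bucket name are hypothetical, credentials and connector configuration are omitted, and a full migration would also cover metadata, security policies and workflow cut-over.

    # Illustrative data-copy step for an HDFS -> cloud object storage migration.
    # Paths and bucket name are hypothetical; the s3a connector must be configured.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("migrate-sales").getOrCreate()

    # Read the existing on-premise dataset from HDFS.
    sales = spark.read.parquet("hdfs:///data/warehouse/sales")

    # Write it to a cloud bucket in the same format.
    (
        sales.repartition(200)
        .write.mode("overwrite")
        .parquet("s3a://example-migration-bucket/warehouse/sales")
    )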

 

Eliminate Risks

Our expert team of data consultants regularly conduct discovery sessions and more in-depth risk assessment engagements, to help companies choose the best option for their current technology stack and strategic roadmap.

Whether you decide to maintain, upgrade or migrate your on-premise data platform, our data engineers will align with your teams to accelerate timelines, eliminate risk, and minimise impact to your daily operations.

“WHISHWORKS’ solution enabled us to achieve our business objectives and goals with increased service levels, demonstrating a high level of expertise by effectively implementing the project to the agreed timelines and going a level beyond what we thought was achievable.”

Lee Taaffe, Business Information Analyst at LGC | Data & Analytics

Data & Analytics Solutions

Hortonworks Data Platform Support

Our Data & Analytics team will work with you to plan an approach and solution that best meets your needs. Together we will create a plan based on your technical environment, goals and projects.

If your Hortonworks platform is on a version earlier than HDP 3.1, you are probably assessing your options, as your existing platform technology stack may have already reached, or will soon reach, End of Life. At WHISHWORKS we have worked with all the major on-premise, hybrid and cloud Big Data platforms, including Hortonworks and Cloudera, and have identified four paths forward depending on your current HDP version and strategic roadmap.

  • Our Risk Assessment Promise

    Our consultants will work with you to analyse your Big Data ecosystem, use cases and goals, and provide you with the most efficient and cost-effective plan for your Hortonworks Data Platform. This is a standalone service offered by expert, certified consultants with years of experience working with Hortonworks Data Platform.

    At the end of the engagement, you will have a detailed assessment of your environment and recommended options, alongside a blueprint of the easiest path to continuity and expansion.

  • WHISHWORKS Support

    Whether you decide to maintain, upgrade or migrate your Hortonworks Data Platform, WHISHWORKS data engineers will align with your teams to minimise risks, maximise benefits and ensure your Big Data initiatives continue uninterrupted.

Confluent & Apache Kafka Support

Our event streaming practice is underpinned by a team of expert senior consultants with deep expertise in Confluent & Kafka deployments that require high availability, enterprise compliance, scalability and robust, industrial-grade operations.
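
To give a flavour of the building blocks our consultants work with, the sketch below shows a minimal Kafka producer and consumer using the Confluent Python client; the broker address, topic and consumer group are hypothetical placeholders.

    # Minimal Kafka producer/consumer sketch using confluent-kafka.
    # Broker, topic and group id are hypothetical placeholders.
    from confluent_kafka import Producer, Consumer

    BROKERS = "broker1:9092"
    TOPIC = "sensor-readings"

    # Produce a single message.
    producer = Producer({"bootstrap.servers": BROKERS})
    producer.produce(TOPIC, key="device-42", value='{"temp": 21.5}')
    producer.flush()

    # Consume messages from the same topic.
    consumer = Consumer({
        "bootstrap.servers": BROKERS,
        "group.id": "analytics-demo",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe([TOPIC])

    try:
        while True:
            msg = consumer.poll(1.0)
            if msg is None:
                continue
            if msg.error():
                print(f"Consumer error: {msg.error()}")
                continue
            print(f"{msg.key()}: {msg.value().decode('utf-8')}")
    finally:
        consumer.close()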

  • Design & Architecture

    • Kafka architecture review and assessment
    • Architectural design and discovery
    • Kafka sizing and platform requirements
    • Kafka topology design
    • Identification of use cases
  • Deployment & Implementation

    • Data security, governance & encryption
    • Kafka Producers and Consumers
    • Data Pipelines, robust data delivery & processing
    • Multi Datacentre design and deployment
    • Data centre synchronisation and replication
    • Real-time stream processing applications
    • Real-time streaming analytics
    • On-premise / hybrid / cloud deployment
  • Support

    • High Availability
    • Disaster Recovery
    • Broker optimisation
    • Performance and Capacity Planning
    • Zookeeper
    • Support and issues management on existing open source Kafka clusters
    • 24×7 or tailored support packages
    • Incident management and reporting
    • On-site, off-site capabilities
  • Managed Services

    • Cluster administration
    • E2E Platform Management
    • Tailored services

Apache Spark Support

Apache Spark consulting, implementation, optimisation and support


Our Spark Specialism

Apache Spark is a fast and general engine for Big Data processing, with built-in modules for streaming, SQL, Machine Learning and graph processing. At WHISHWORKS we have worked extensively with Apache Spark in many Big Data projects:

• Implementation of robust production data pipelines at scale.

• Implementation of multiple “Spark and NiFi” based IoT pipelines.

• Numerous projects requiring Spark applications to perform efficiently on YARN clusters.

• Introduction of SMACK (Spark, Mesos, Akka, Cassandra, and Kafka) stack into our Big Data roadmap.

• Development of reusable component registries, based on our extensive production experience, that cut development time for enterprise-grade search solutions built with Spark and Apache Solr by almost 50%.

• Extensive experience in building and running production-grade data pipelines on cloud platforms such as AWS and Azure.

• Multiple use cases involving streaming data processing, interactive analytics, batch processing and Machine Learning.
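
As a small taste of the Spark work described above, the sketch below runs a simple Spark aggregation over Parquet data; the input path and column names are hypothetical.

    # Minimal Spark aggregation sketch over curated Parquet data.
    # Input path and column names are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import count, avg

    spark = SparkSession.builder.appName("daily-usage-report").getOrCreate()

    events = spark.read.parquet("/data/lake/curated/events")

    report = (
        events.groupBy("event_date", "device_type")
        .agg(count("*").alias("events"), avg("duration_ms").alias("avg_duration_ms"))
        .orderBy("event_date")
    )

    report.write.mode("overwrite").parquet("/data/lake/reports/daily_usage")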

  • Consulting

    • Needs Analysis
    • Architectural Consulting
    • Spark Cluster Architecture Review & Design
    • Identification of Use Cases

  • Managed Services

    • Cluster Administration & Optimisation
    • Tailored Services
    • Staff Augmentation

  • Application Support

    • High-performance Spark application implementation: real-time / batch / streaming / offline analytics
    • Full Spark stack delivery: Spark SQL, Spark ML, Spark Streaming, Spark GraphX
    • Deliver high-quality SQL that runs seamlessly on the Spark engine, backed by AWS (S3 and Redshift) or Azure Blob/Table Storage
    • Deliver high-performance Spark-based data pipelines by strictly following a Test-Driven Development approach

  • Deployment & Application Delivery

    • Support and issues management on existing open source Spark clusters
    • Support to maintain Spark SLAs and SLOs consistently
    • Spark SQL read/write speed optimisation
    • Spark multi-user cluster sharing
    • 24×7 or tailored support packages
    • Incident management and reporting