
Migrate Hadoop data to Amazon S3 by using WANdisco LiveData Migrator


Prerequisites

  • A Hadoop cluster edge node where LiveData Migrator will be installed. The node must meet the following requirements (a preflight check sketch follows this list):

    • Minimum specification: 4 CPUs, 16 GB RAM, 100 GB storage.

    • A minimum 2 Gbps network connection.

    • Port 8081 open on the edge node for access to the WANdisco UI.

    • Java 1.8 64-bit.

    • Hadoop client libraries installed on the edge node.

    • Ability to authenticate as the HDFS superuser (for example, "hdfs").

    • If Kerberos is enabled on your Hadoop cluster, a valid keytab that contains a suitable principal for the HDFS superuser must be available on the edge node.

    • See the release notes for a list of supported operating systems.

  • An active AWS account with access to an S3 bucket.

  • An AWS Direct Connect link established between your on-premises Hadoop cluster (specifically the edge node) and AWS.
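
The pattern itself does not ship a verification script, but a quick preflight check can catch most of these gaps before installation. The following is a minimal Python sketch, assuming boto3 is installed on the edge node; the bucket name, keytab path, and Kerberos principal below are placeholders to replace with your own values.

#!/usr/bin/env python3
"""Sketch: preflight checks for a LiveData Migrator edge node.

Assumes boto3 is installed and that the java, hdfs, and kinit binaries
are on PATH. All names below are placeholders, not values from the pattern.
"""
import shutil
import socket
import subprocess

import boto3
from botocore.exceptions import ClientError

S3_BUCKET = "my-migration-target-bucket"                # placeholder: your target bucket
KEYTAB = "/etc/security/keytabs/hdfs.headless.keytab"   # placeholder keytab path
PRINCIPAL = "hdfs@EXAMPLE.COM"                          # placeholder HDFS superuser principal


def check_binary(name):
    """Confirm a required CLI tool (java, hdfs, kinit) is on PATH."""
    path = shutil.which(name)
    print(f"{name}: {'found at ' + path if path else 'MISSING'}")
    return path is not None


def check_ui_port(host="localhost", port=8081):
    """Once LiveData Migrator is running, confirm the WANdisco UI port answers."""
    try:
        with socket.create_connection((host, port), timeout=3):
            print(f"port {port}: reachable")
    except OSError as exc:
        print(f"port {port}: not reachable ({exc})")


def check_kerberos():
    """Obtain a ticket for the HDFS superuser from the keytab.

    Only relevant if Kerberos is enabled on the cluster; skip otherwise.
    """
    result = subprocess.run(
        ["kinit", "-kt", KEYTAB, PRINCIPAL],
        capture_output=True, text=True,
    )
    print("kinit:", "ok" if result.returncode == 0 else result.stderr.strip())


def check_s3_bucket():
    """Confirm the AWS credentials on this node can see the target bucket."""
    try:
        boto3.client("s3").head_bucket(Bucket=S3_BUCKET)
        print(f"s3://{S3_BUCKET}: accessible")
    except ClientError as exc:
        print(f"s3://{S3_BUCKET}: {exc}")


if __name__ == "__main__":
    for tool in ("java", "hdfs", "kinit"):
        check_binary(tool)
    check_ui_port()
    check_kerberos()
    check_s3_bucket()

Run the sketch as the user that will operate LiveData Migrator, and omit the Kerberos check on clusters where Kerberos is not enabled.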

Product versions

  • LiveData Migrator 1.8.6

  • WANdisco UI (OneUI) 5.8.0

Source: https://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/migrate-hadoop-data-to-amazon-s3-by-using-wandisco-livedata-migrator.html
