Change these settings in elasticsearch.yml on the new (destination) cluster. Both are static node settings, so restart the node after changing them.

  • Allow reindexing from a remote host
reindex.remote.whitelist: "10.5.1.2:9200"
  • Skip SSL certificate verification (only acceptable for self-signed certificates on a trusted network)
reindex.ssl.verification_mode: none
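If more than one remote host is involved, `reindex.remote.whitelist` also accepts a comma-delimited list; a sketch (the second host here is hypothetical):

```yaml
# elasticsearch.yml — multiple remote hosts; 10.5.1.3 is illustrative
reindex.remote.whitelist: "10.5.1.2:9200, 10.5.1.3:9200"
```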

Export the list of index names from the old cluster

$ curl -XGET https://10.5.1.2:9200/_cat/indices -u 'admin:xxx' --insecure > indices.txt

$ grep "filebeat" indices.txt > filebeat.txt

$ cat filebeat.txt
yellow open filebeat-7.10.1-2021.02.08     ux0_0ABgRjymnXVYZ1ipqw 1 1  419946  0  107.4mb  107.4mb
yellow open filebeat-7.10.1-2021.02.09     sW-H7t10Qlu_47ayzrNF3Q 1 1  410038  0    101mb    101mb
yellow open filebeat-7.12.1-2021.04.29     BsG7iUJ_S9WPd9lxhQ2oTA 1 1    1130  0  483.6kb  483.6kb
yellow open filebeat-7.12.1-2021.04.28     pCUjavtuQLC6Fy0MMGN_eA 1 1      21  0   29.7kb   29.7kb

$ cut -d' ' -f3 filebeat.txt > filebeat_indexname.txt
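Note that `cut -d' ' -f3` assumes exactly one space between the first three columns; `_cat/indices` pads its columns for alignment, so `awk`, which splits on any run of whitespace, is a safer way to extract the index name. A minimal sketch, with a sample line standing in for the real output:

```shell
#!/bin/sh
# awk splits fields on runs of whitespace, so padded _cat/indices
# columns are handled correctly. The printf line fakes real output.
printf 'yellow open filebeat-7.10.1-2021.02.08     ux0_0ABgRjymnXVYZ1ipqw 1 1  419946  0  107.4mb  107.4mb\n' > filebeat.txt
awk '{print $3}' filebeat.txt > filebeat_indexname.txt
cat filebeat_indexname.txt
```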

Recreate Ingest pipelines & templates

Open Dev tools from Kibana on old cluster

GET _ingest/pipeline
GET /_template

Get all pipelines & templates then recreate them on the new cluster

PUT _ingest/pipeline/filebeat-7.12.0-fail2ban-log-pipeline
{
    "description" : "Pipeline for parsing fail2ban logs. Requires the geoip plugin.",
    "processors" : [
      {
        ...
      }
    ]
}

PUT _template/filebeat-7.11.2
{
    "order" : 1,
    "index_patterns" : [
      "filebeat-7.11.2-*"
    ],
    "settings" : {
      "index" : {
        "mapping" : {
          "total_fields" : {
            "limit" : "10000"
          }
        },
        ...
      }
    }
}
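To avoid pasting every definition through Dev Tools by hand, one option is to save each pipeline (or template) body to its own JSON file and replay them in a loop. A sketch under an assumed file layout (the `pipelines/` directory and sample file are hypothetical):

```shell
#!/bin/sh
# Hypothetical layout: one JSON body per file, named after its pipeline.
mkdir -p pipelines
echo '{"description":"sample","processors":[]}' > pipelines/sample-pipeline.json

for f in pipelines/*.json; do
  name=$(basename "$f" .json)
  # Dry run: prints the target. Replace echo with the real call, e.g.:
  #   curl -XPUT -H 'Content-Type: application/json' \
  #     https://127.0.0.1:9200/_ingest/pipeline/"$name" -u 'admin:xxx' --insecure -d @"$f"
  echo "would PUT _ingest/pipeline/$name"
done
```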

Create a script, reindex.sh, to migrate the data

#!/bin/bash

# Reindex each index listed in filebeat_indexname.txt (one name per line)
# from the old cluster (10.5.1.2) into the new one (127.0.0.1).
input="filebeat_indexname.txt"
while IFS= read -r line
do
  echo "Index $line"
  curl -XPOST -H 'Content-Type: application/json' https://127.0.0.1:9200/_reindex -u 'admin:xxx' --insecure -d'{
    "source": {
      "remote": {
        "host": "https://10.5.1.2:9200",
        "username": "admin",
        "password": "xxx"
      },
      "index": "'"$line"'"
    },
    "dest": {
      "index": "'"$line"'"
    }
  }'
  echo ""
  echo "----------"
  sleep 2
done < "$input"
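Before firing real reindex calls, a dry run that only prints the target indices is a cheap way to catch a bad input file. A sketch, with sample names standing in for the real filebeat_indexname.txt:

```shell
#!/bin/sh
# Dry run: print each index the script would reindex, without calling curl.
# The printf line fakes the real filebeat_indexname.txt contents.
printf 'filebeat-7.10.1-2021.02.08\nfilebeat-7.10.1-2021.02.09\n' > filebeat_indexname.txt
while IFS= read -r line
do
  echo "would reindex $line"
done < filebeat_indexname.txt
```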

Import data

$ bash reindex.sh
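After the script finishes, it is worth confirming that document counts match between the two clusters. A sketch that diffs the index name and docs.count columns (column 3 and 7 of `_cat/indices`); the file names are hypothetical, and in practice each listing would come from the `_cat/indices` curl shown earlier, run against the respective cluster:

```shell
#!/bin/sh
# Compare per-index document counts between the old and new cluster
# listings. The printf lines fake saved _cat/indices output.
printf 'yellow open fb-1 u 1 1 419946 0 107.4mb 107.4mb\n' > indices_old.txt
printf 'yellow open fb-1 u 1 1 419946 0 107.4mb 107.4mb\n' > indices_new.txt

awk '{print $3, $7}' indices_old.txt | sort > old_counts.txt
awk '{print $3, $7}' indices_new.txt | sort > new_counts.txt
diff old_counts.txt new_counts.txt && echo "counts match"
```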