Automate copying zones/records to a different DNS view

Guru
Posts: 54

I’m playing with DNS views in my lab environment and would like to know whether there is an easy, convenient way to copy entire zones from one DNS view to another. I have a total of 52 zones to copy to a different view. One of the zones will need some significant modifications before importing into the new view; the rest can be copied over as-is. I will be doing this many times while finalizing a process to modify our production grid.

 

Right now I’m exporting all 2,000 zones to a CSV file and using a text editor to remove the zones I don’t want to copy. I run a search-and-replace on the DNS view name, then import this CSV into the new DNS view, which creates the “empty” zones. I then have to manually create a CSV export file for each of the 52 zones to populate the records in them. This is tedious and time-consuming.
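The search-and-replace part could probably be scripted too. Here’s a rough sketch I tried; the view names and the sample row are just placeholders, and it only rewrites a field that exactly matches the old view name:

```shell
#!/bin/bash
# Swap the DNS view column in an exported zone CSV.
# "default" and "external" are placeholder view names.
OLDVIEW="default"
NEWVIEW="external"

# Replace a field only when it matches the old view name exactly,
# so zone names that merely contain it are left untouched.
rename_view() {
   awk -F, -v OFS=, -v old="$OLDVIEW" -v new="$NEWVIEW" \
      '{ for (i = 1; i <= NF; i++) if ($i == old) $i = new; print }'
}

# Example row from a zone export (made up)
printf 'authzone,corp.example.com,default\n' | rename_view
```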

 

I searched the web for ideas on how to automate this process and found the ibcli package, but it doesn’t look like it’s supported or updated any more. I also checked out the RESTful API; it looks like objects can be created and updated. I’m pretty sure the Perl API can accomplish this too.
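From what I can tell, an auth zone is a zone_auth object in the WAPI with fqdn and view fields, so creating an “empty” zone in the target view should be a single POST. A rough sketch (server name, credentials, and zone/view names are placeholders; the curl command is only printed here, not run):

```shell
#!/bin/bash
# Sketch: create an "empty" auth zone in another DNS view via WAPI.
# Server, credentials, zone, and view below are all placeholders.
SERVER="gm.example.com"
VERSION="v2.3"

make_zone_body() {
   # $1 = zone fqdn, $2 = target DNS view
   printf '{"fqdn":"%s","view":"%s"}' "$1" "$2"
}

BODY=$(make_zone_body "corp.example.com" "internal")

# Dry run: echo the command instead of executing it
echo curl -s -k -u "admin:password" \
   -H "Content-Type: application/json" \
   -X POST "https://$SERVER/wapi/$VERSION/zone_auth" \
   -d "$BODY"
```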

 

Automating the process of copying zones to a different DNS view seems like a fairly common thing to do.  Before I sit down and code something, I’d like to see if someone else has already done this.  I don’t want to reinvent the wheel if I don’t have to.  If anybody has any information on this I’d appreciate it.  Otherwise, I’ll have to code something from scratch.

 

Thank you,

Clark

Re: Automate copying zones/records to a different DNS view

Guru
Posts: 54

Looks like I should have created this post in the "API and Integration" section.  Can someone move it?

Re: Automate copying zones/records to a different DNS view

Guru
Posts: 54

I've written a bash shell script which uses the RESTful API to do what I need.  I have one minor issue with the upload script.  When I get that worked out I will post the script(s) here.

Re: Automate copying zones/records to a different DNS view

Guru
Posts: 54

Here is a bash shell script that uses the RESTful API to mass-download zone files in CSV format.  No need to have Perl or Python installed.

 

#!/bin/bash -

# This script reads a file, zonelist.txt, that contains a 
# list of zone files to download.  A separate csv file will be
# created for each zone.  The csv file will be named the
# same as the zone name.  Minimal error checking is
# performed.  Use at your own risk.
# All files are located in the same directory as this script.

# Username and password with permission to download csv files
USERNAME="admin"
PASSWORD="password"

# Grid Master
SERVER="gm.example.com"

# Define file containing list of zones to export
ZONELIST="zonelist.txt"

# Define file that will contain results of curl command
OUTFILE="result.txt"

# Location of curl on this system.  Use -s so curl is silent
CURL="/usr/bin/curl -s"

# WAPI version
VERSION="v2.3"

# What view are these zones in?
VIEW="default"

############################################
# No more variables to set below this line #
############################################

# Process the zonelist file one line at a time
# -r stops read from mangling any backslashes in the input
while read -r ZONE
do

   echo
   echo
   echo
   echo
   echo
   echo "Processing zone:    $ZONE"

   # Create CSV file for this zone
   $CURL \
      --tlsv1 \
      --insecure \
      -u "$USERNAME:$PASSWORD" \
      -H "Content-Type: application/json" \
      -X POST "https://$SERVER/wapi/$VERSION/fileop?_function=csv_export" \
      -d "{\"_object\":\"allrecords\",\"view\":\"$VIEW\",\"zone\":\"$ZONE\"}" \
      > $OUTFILE

   ERROR_COUNT=`grep -c Error $OUTFILE`
   if [ $ERROR_COUNT -gt 0 ]; then
      # Display the error and skip rest of loop
      grep Error $OUTFILE
      continue
   fi

   # Get the "token" and "download URL" for later use
   TOKEN=`grep "token" $OUTFILE | cut -d"\"" -f4`
   URL=`  grep "url"   $OUTFILE | cut -d"\"" -f4`
   echo "Token:              $TOKEN"
   echo "URL:                $URL"

   # Download the CSV file
   $CURL \
      --tlsv1 \
      --insecure \
      -u "$USERNAME:$PASSWORD" \
      -H "Content-Type: application/force-download" \
      -O "$URL"

   # Rename CSV file so the file name matches the zone name
   FILENAME="$ZONE.csv"
   # Reverse zones will contain the / character which will be interpreted
   # as a directory delimiter if included in file name.  Replace with +
   FILENAME=`echo $FILENAME | tr \/ +`
   echo "Filename:           $FILENAME"
   mv Zonechilds.csv $FILENAME

   # Let NIOS know download is complete
   $CURL \
      --tlsv1 \
      --insecure \
      -u "$USERNAME:$PASSWORD" \
      -H "Content-Type: application/json" \
      -X POST "https://$SERVER/wapi/$VERSION/fileop?_function=downloadcomplete" \
      -d "{ \"token\": \"$TOKEN\"}"

done < "$ZONELIST"

exit
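One caveat: the grep/cut token extraction assumes the WAPI pretty-prints its JSON one field per line, which NIOS seems to do by default. If that ever changes, a sed-based extraction like the following (shown against a made-up sample response) also copes with a single-line reply:

```shell
#!/bin/bash
# Pull a quoted string field out of a WAPI JSON reply.
# Works whether the JSON arrives pretty-printed or on one line.
extract_field() {
   # $1 = field name, JSON text on stdin
   sed -n "s/.*\"$1\": *\"\([^\"]*\)\".*/\1/p" | head -n 1
}

# Made-up sample of a csv_export response
SAMPLE='{ "token": "abc123==", "url": "https://gm.example.com/http_direct_file_io/req_id-1/Zonechilds.csv" }'

TOKEN=$(printf '%s\n' "$SAMPLE" | extract_field token)
URL=$(printf '%s\n' "$SAMPLE" | extract_field url)
echo "Token: $TOKEN"
echo "URL:   $URL"
```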

Re: Automate copying zones/records to a different DNS view

verne
Techie
Posts: 12

Did you get the bugs worked out of your upload script?

 

Re: Automate copying zones/records to a different DNS view

Guru
Posts: 54

Yes, I did get the bugs worked out.  I plan on posting all my scripts in this forum. 

Re: Automate copying zones/records to a different DNS view

Authority
Posts: 31

Hey Clark...

Just curious if you know of anything that changed in curl that would affect your script.  I've been trying to get it to work this past week and am just not having any luck.  I keep getting the following errors:

 

Processing zone:    1800thezah.com
curl: no URL specified!
curl: try 'curl --help' or 'curl --manual' for more information
./ibzoneXport.sh: line 52: -u: command not found
Token:              
URL:                
curl: no URL specified!
curl: try 'curl --help' or 'curl --manual' for more information
Filename:           1800thezah.com.csv
mv: cannot stat 'Zonechilds.csv': No such file or directory
[HTML error page from a web proxy, trimmed to the relevant lines]
Network Error (dns_unresolved_hostname)
Your requested host "gridprod.thezah.corp" could not be resolved by DNS.

 

 

Here are the variables, if it helps; otherwise it's identical to what you have above.

#!/bin/bash -

# This script reads a file, zonelist.txt, that contains a 
# list of zone files to download.  A separate csv file will be
# created for each zone.  The csv file will be named the
# same as the zone name.  Minimal error checking is
# performed.  Use at your own risk.
# All files are located in the same directory as this script.

# Username and password with permission to download csv files
USERNAME="dennis"
PASSWORD="superpass"

# Grid Master
SERVER="gridprod.thezah.corp"

# Define file containing list of zones to export
ZONELIST="zahzones.txt"

# Define file that will contain results of curl command
OUTFILE="zahresult.txt"

# Location of curl on this system.  Use -s so curl is silent
CURL="/usr/bin/curl -s"

# WAPI version
VERSION="v2.3"

# What view are these zones in?
VIEW="External DNS View"

Re: Automate copying zones/records to a different DNS view

Guru
Posts: 54

The error message says "no URL specified", so I would check your variables to see if they contain what you think they contain.  I had this same problem when I was writing the script, and it turned out one of my variables was not set.  Just before the curl command, I used echo to list all my variables and found the problem.
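To make that check automatic, the script could refuse to run when a required variable is empty. Something like this (variable names mirror the download script; bash indirect expansion does the lookup):

```shell
#!/bin/bash
# Sketch of the "check your variables" advice: fail fast when a
# required variable is empty instead of letting curl report
# "no URL specified".
require_set() {
   # $1 = name of the variable to check (bash indirect expansion)
   if [ -z "${!1}" ]; then
      echo "Required variable $1 is empty - check your settings" >&2
      return 1
   fi
}

SERVER="gm.example.com"
ZONE=""
require_set SERVER && echo "SERVER looks good"
require_set ZONE || echo "ZONE needs fixing"
```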


Re: Automate copying zones/records to a different DNS view

[ Edited ]
Guru
Posts: 54

Another thing I would look at: check whether any of your variables contain a space, because spaces will invalidate the URL.  Your $VIEW variable contains spaces, though you may have just done that as an example view name.  If you need to use spaces, you have to escape them.
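For the record, spaces only need escaping where the value ends up in the URL itself (a GET query string, for example); in a quoted JSON POST body they are legal as-is. If you do need to put a view name in a URL, percent-encoding is the standard escape, and curl's --data-urlencode option can also handle it. A pure-bash sketch:

```shell
#!/bin/bash
# Percent-encode a string for safe use in a URL query string.
# Unreserved characters pass through; everything else becomes %XX.
urlencode() {
   local s="$1" out="" c i
   for (( i = 0; i < ${#s}; i++ )); do
      c="${s:$i:1}"
      case "$c" in
         [a-zA-Z0-9.~_-]) out+="$c" ;;
         *) out+=$(printf '%%%02X' "'$c") ;;   # "'c" yields the char code
      esac
   done
   printf '%s\n' "$out"
}

urlencode "External DNS View"
```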

Re: Automate copying zones/records to a different DNS view

[ Edited ]
Guru
Posts: 54

Do you need me to post the upload script?

Re: Automate copying zones/records to a different DNS view

verne
Techie
Posts: 12

sorry for the late reply ...

 

YES, having the upload script will be useful

 

Re: Automate copying zones/records to a different DNS view

Guru
Posts: 54

Here is the upload script I am using to upload and import CSV files.  I have changed the philosophy behind the way I'm writing these Infoblox scripts.  For processes like uploading and downloading a CSV file, I am writing one script that takes command line input and performs that one upload or download.  If I want to batch upload or batch download a lot of CSV files, I wrap that "single use" script in another BASH shell script and iterate through a list of CSV files.  The advantage to doing this is that I only have to maintain one script to upload and one script to download CSV files.  As I find bugs or improvements, I only have to make changes in one place.

 

Disclaimer - I am happy to share my scripts with others.  However, I have not tested these scripts in every possible scenario.  My scripts could possibly contain bugs that I have not found because my use case is different from yours.  It is possible that some undiscovered bug in my script could damage your production system.  If you use any of my scripts please make sure you understand what they are doing before you use them.  If you don't understand BASH shell scripting, don't use my scripts.  Since this particular script accepts command line arguments, and one of those arguments is a password, this could be a potential security issue for your company because the command may be written to your BASH history file.  Use at your own risk.

 

All my scripts were written on Ubuntu 14 and tested against NIOS 7.3.8 in a development environment.  I have not done extensive error checking and data validation on the input.  These are quick and dirty scripts meant to be run against a development system. 

 

At some point this summer I will clean these scripts up with better comments and better error checking.

 

#!/bin/bash

# This script is for development use only and not
# intended to be used on a production system.
# Do not use this script unless you are familiar with BASH
# shell programming.  Use at your own risk.
#
# This script will upload and import a csv file.
#
# The error checking that is done is only to determine
# if the curl command was successful.  No error checking is
# done on the Infoblox side to see if the csv file was
# imported correctly.
#
# This is a reusable BASH function to check
# for errors after each curl command is run
check_for_errors()
{
   ERROR_COUNT=`grep -c "Error" $OUTFILE`
   AUTH_ERROR=`grep -c "401 Authorization Required" $OUTFILE`
   TOTAL_ERRORS=`expr "$ERROR_COUNT" + "$AUTH_ERROR" + "$EXIT_CODE"`
   echo
   echo "Total errors: $TOTAL_ERRORS"
   echo
   if [ $TOTAL_ERRORS -gt 0 ]; then
      echo
      echo "Errors detected . . . exiting"
      echo "Errors found in returned document: $ERROR_COUNT"
      echo "Authentication errors:             $AUTH_ERROR"
      echo "Curl exit code was:                $EXIT_CODE"
      echo
      cat $OUTFILE
      exit
   fi
}

print_usage()
{
   echo
   echo
   echo
   echo "Usage: $0 -h hostname -u username -p password -f filename"
   echo
   echo "NOTE: Passwords containing special character may need to be wrapped in single quotes."
   echo
   echo
   echo
   exit
}

# End of functions

########################
########################
## SET VARIABLES HERE ##
########################
########################

# Output file that will contain the results of the curl command
# If this file does not exist, it will be created automatically
OUTFILE="result.txt"

# Full path to the curl command - add any options here
# The -s option makes curl silent; swap in -v when debugging
CURL="/usr/bin/curl -s"

# Location of curl's cookie file - use full path to be safe
# If this file does not exist, it will be created automatically
COOKIEFILE="cookies.txt"

# Infoblox RESTful API version
# Make sure to include the letter "v" eg. "v2.3"
VERSION="v2.3"

##########################
##########################
## END OF SET VARIABLES ##
##########################
##########################

# Make sure exactly eight arguments were passed to the script
# Four switches and four values equals eight
if [[ $# -ne 8 ]]; then
   echo "Number of arguments = $#"
   print_usage
fi

# Get the command line arguments
while [[ $# -gt 1 ]]
do
   key="$1"
   case $key in
   -h)
      HOST="$2"
      shift # past argument
   ;;
   -u)
      USERNAME="$2"
      shift # past argument
   ;;
   -p)
      PASSWORD="$2"
      shift # past argument
   ;;
   -f)
      CSVFILE="$2"
      shift # past argument
   ;;
   *)
      # unknown option
      print_usage
   ;;
   esac
   shift # past argument or value
done

# Comment the following echo commands if you don't want to do any debugging
# This will display the password on the screen, which might be handy
# since some special characters in the password could break the script
echo
echo "Host:     $HOST"
echo "Username: $USERNAME"
echo "Password: $PASSWORD"
echo "CSV file: $CSVFILE"

# See if the specified CSV file exists
if [ ! -e "$CSVFILE" ]; then
   echo "File does not exist: $CSVFILE"
   echo
   exit
fi

# See if file is empty
if [ ! -s "$CSVFILE" ]; then
   echo "File $CSVFILE is empty"
   echo
   exit
fi

echo
echo "#######################"
echo "# Processing csv file # $CSVFILE"
echo "#######################"
echo

# Get the csv file name without the path
BASE_CSVFILE=`basename "$CSVFILE"`

# Initiate a CSV file upload - get upload token and URL from server
# It will be in $OUTFILE
echo "Initializing CSV upload"
$CURL \
   --dump-header $COOKIEFILE \
   --insecure \
   -u "$USERNAME:$PASSWORD" \
   -H "Content-Type: application/json" \
   -X POST "https://$HOST/wapi/$VERSION/fileop?_function=uploadinit" \
   -d "{ \"filename\":\"$BASE_CSVFILE\" }" > $OUTFILE
EXIT_CODE=$?
check_for_errors

# Get the "token" and "download URL" for later use
TOKEN=`grep "token" $OUTFILE | cut -d"\"" -f4`
URL=`  grep "url"   $OUTFILE | cut -d"\"" -f4`

# Uncomment to see token and URL displayed on screen
# echo "Token:              $TOKEN"
# echo "URL:                $URL"

echo

# Upload the CSV file
# To debug data sent to host, add "--trace-ascii /dev/stdout" to
# the end of the curl command
echo "Uploading CSV data"
$CURL \
   --cookie $COOKIEFILE \
   --insecure \
   -F name=$BASE_CSVFILE \
   -F filedata=@$CSVFILE \
   -X POST $URL > $OUTFILE
EXIT_CODE=$?
check_for_errors

echo

# Start the CSV import job
echo "Starting CSV import job"
$CURL \
   --cookie $COOKIEFILE \
   --insecure \
   -H "Content-Type: application/json" \
   -X POST "https://$HOST/wapi/$VERSION/fileop?_function=csv_import" \
   -d "{ \"action\":\"START\", \"operation\":\"INSERT\",\"token\":\"$TOKEN\",\"on_error\":\"CONTINUE\" }" > $OUTFILE
EXIT_CODE=$?
check_for_errors
cat $OUTFILE

echo
echo
echo

exit
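And here's what the wrapper idea from my earlier post looks like in practice.  The script name ibcsvupload.sh, the host, and the credentials are placeholders, and the commands are only echoed, so this stays a dry run:

```shell
#!/bin/bash
# Sketch of the batch wrapper: call the single-file upload script
# once per CSV file.  "ibcsvupload.sh", the host name, and the
# credentials are placeholders.
UPLOAD="./ibcsvupload.sh"

batch_upload() {
   # $1 = directory containing the CSV files
   local f
   for f in "$1"/*.csv; do
      [ -e "$f" ] || continue          # no matches: skip the raw glob
      echo "$UPLOAD -h gm.example.com -u admin -p 'password' -f $f"
   done
}

# Demonstration against a throwaway directory with stand-in files
DEMO=$(mktemp -d)
touch "$DEMO/alpha.csv" "$DEMO/beta.csv"
batch_upload "$DEMO"
rm -rf "$DEMO"
```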