<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://wiki.1701technology.com/index.php?action=history&amp;feed=atom&amp;title=RoboCopy_Access_Denied_Report_-_Linux</id>
	<title>RoboCopy Access Denied Report - Linux - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://wiki.1701technology.com/index.php?action=history&amp;feed=atom&amp;title=RoboCopy_Access_Denied_Report_-_Linux"/>
	<link rel="alternate" type="text/html" href="https://wiki.1701technology.com/index.php?title=RoboCopy_Access_Denied_Report_-_Linux&amp;action=history"/>
	<updated>2026-05-06T13:46:05Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.34.1</generator>
	<entry>
		<id>https://wiki.1701technology.com/index.php?title=RoboCopy_Access_Denied_Report_-_Linux&amp;diff=137&amp;oldid=prev</id>
		<title>Michael.mast: Created page with &quot;After inheriting a mess of a Windows domain I was promptly tasked with keeping copies of files on remote servers in the datacenter. The weird thing is that the Domain Admins g...&quot;</title>
		<link rel="alternate" type="text/html" href="https://wiki.1701technology.com/index.php?title=RoboCopy_Access_Denied_Report_-_Linux&amp;diff=137&amp;oldid=prev"/>
		<updated>2016-08-03T18:11:34Z</updated>

		<summary type="html">&lt;p&gt;Created page with &amp;quot;After inheriting a mess of a Windows domain I was promptly tasked with keeping copies of files on remote servers in the datacenter. The weird thing is that the Domain Admins g...&amp;quot;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;After inheriting a mess of a Windows domain, I was promptly tasked with keeping copies of files from remote servers in the datacenter. The odd part was that the Domain Admins group did not have full access to all files (not by design; just a case of incompetence and tangled Windows ACLs).&lt;br /&gt;
&lt;br /&gt;
Robocopy ran from the datacenter, reaching out to grab the files, and kept a log of each run. The logs detailed failed files and directories that had previously succeeded, so I needed to keep an eye on them. Using a Linux archive server with read-only access to the file server, I wrote up a quick script that emails me a list of all failed files. It's rough around the edges, but it gets the job done.&lt;br /&gt;
&lt;br /&gt;
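As an illustration of what the script below does, here is the grep/sed pipeline run against a fabricated robocopy failure entry (the log text, server, and paths are invented for this example):

```shell
# Fabricated robocopy log excerpt: the error line naming the file,
# followed by the "Access is denied." detail line.
printf '2016/08/03 18:11:34 ERROR 5 (0x00000005) Copying File \\\\fileserver\\share\\reports\\q2.xlsx\nAccess is denied.\n' > /tmp/sample.log

# Grab each "denied" line plus the line before it (the one with the path)
grep -B 1 denied /tmp/sample.log > /tmp/deniedlist.txt

# Strip everything up to the leading double backslash of the UNC path
# (GNU sed's -i, as in the script)
sed -i 's|.*\\\\||' /tmp/deniedlist.txt

cat /tmp/deniedlist.txt
# /tmp/deniedlist.txt now contains:
#   fileserver\share\reports\q2.xlsx
#   Access is denied.
```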
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
##This first copy is not necessary if your network and server are healthy. My environment was pushed to its limits and performance was really poor, so working from local copies of the logs was a good idea.&lt;br /&gt;
cp /directory/where/robocopy/logs/are/kept/* /working/directory/for/files&lt;br /&gt;
## Collect each line containing "denied" plus the line before it (the file path)&lt;br /&gt;
grep -r -B 1 denied /working/directory/for/files/ &amp;gt; /tmp/deniedlist.txt&lt;br /&gt;
## Strip everything up to the leading double backslash of the UNC path&lt;br /&gt;
sed -i 's|.*\\\\||' /tmp/deniedlist.txt&lt;br /&gt;
## Drop the "denied" lines themselves, which grep -r prefixes with the local log path&lt;br /&gt;
sed -i '\|/working/directory/for/files|d' /tmp/deniedlist.txt&lt;br /&gt;
## Mail the cleaned list; sendmail reads the message body from stdin&lt;br /&gt;
sendmail email@address.send &amp;lt; /tmp/deniedlist.txt&lt;br /&gt;
rm -f /tmp/deniedlist.txt&lt;br /&gt;
rm -f /working/directory/for/files/*&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Michael.mast</name></author>
		
	</entry>
</feed>