|Download web pages and append them to master file|
|Date Updated - 07/22/2007|
I have a task that visits a URL containing a text file every 10 minutes and
saves the file in a directory, using the DATE and TIME as its name. What I need is to set up
a task that goes to that directory, say every 60 minutes, merges only the files whose names carry today's DATE and TIME, and
saves the result to another file, for example results.txt
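For reference, the merge step described above can be sketched in Python. The folder name, the output filename, and the assumption that each downloaded file's name begins with today's date (e.g. "2007-07-22_1030.txt") are placeholders; adjust them to match how the download task actually names its files.

```python
import datetime
import glob
import os

def merge_todays_files(folder, output_name="results.txt"):
    # Assumes file names start with today's date in YYYY-MM-DD form;
    # change the format string to match the real naming scheme.
    today = datetime.date.today().strftime("%Y-%m-%d")
    matches = sorted(glob.glob(os.path.join(folder, today + "*.txt")))
    # Concatenate today's files into the results file.
    with open(os.path.join(folder, output_name), "w") as out:
        for path in matches:
            with open(path) as src:
                out.write(src.read())
    return matches
```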
Thanks for evaluating Automize. You would need to insert/append each newly downloaded file into a BASE file using the File Variable task. To do that, download the web pages (text files) to a staging folder that contains the BASE file. After a new file has been appended to the BASE file, move it to the final storage folder.
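The core append-then-move idea can be sketched outside Automize like this (the function name and folder paths are illustrative, not part of Automize):

```python
import os
import shutil

def append_and_move(new_file, base_file, final_folder):
    # Append the newly downloaded file's contents to the BASE file...
    with open(base_file, "a") as base, open(new_file) as src:
        base.write(src.read())
    # ...then move it out of the staging folder so the next run
    # only sees the next download.
    shutil.move(new_file,
                os.path.join(final_folder, os.path.basename(new_file)))
```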
To do this, you would have to create the following tasks:
1) your web task
Download folder = [Staging folder] which contains BASE file.
2) A File Monitor task (Title = FileMon). This simply gets the name of the last downloaded file.
Directory = [Staging folder]
Filename = none
Task To Run = NONE
Select Option = Run Task if file exists
The FileMonitor task will now contain the variable $%FileMon::FileName%$ which is the name of the last downloaded web page.
3) a File Variable task to append file to the BASE file
variable name = $%FILE::LINE::MID::0::10000::full_BASE_file_path%$
File Path = [Staging Folder]\$%FileMon::FileName%$
Select Filing options as needed
4) a Copy task - This moves the file from the staging folder to a final folder. This is required for the FileMonitor task to work correctly in the next scheduled run.
Option = Move
Source Folder = [Staging Folder]
Target Folder = [Final Folder]
Filename = $%FileMon::FileName%$
5) a chain task (use the big blue links icon on the front panel).
Step 1 = Web Task
Step 2 = File Monitor task
Step 3 = File Variable task
Step 4 = Copy task
6) Instead of scheduling the web task to run every X minutes, schedule this chain every X minutes.
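The whole chain can be mimicked in a single Python function to make the order of operations clear. The URL, folder names, and the fetch step are placeholders (Automize's Web task does the real download, and its File Monitor supplies the real file name):

```python
import os
import shutil
import urllib.request

def run_chain(staging, final, base_name="base.txt",
              fetch=None, url="http://example.com/data.txt"):
    base = os.path.join(staging, base_name)
    # Step 1: Web task - download a new file into the staging folder.
    new_path = os.path.join(staging, "download.txt")
    if fetch is None:
        urllib.request.urlretrieve(url, new_path)  # placeholder URL
    else:
        fetch(new_path)
    # Step 2: File Monitor - the name of the last downloaded file.
    newest = os.path.basename(new_path)
    # Step 3: File Variable - append the new file to the BASE file.
    with open(base, "a") as b, open(new_path) as src:
        b.write(src.read())
    # Step 4: Copy task (Move option) - clear the staging folder
    # so the next scheduled run sees only the next download.
    shutil.move(new_path, os.path.join(final, newest))
    return newest
```

A scheduler would call `run_chain` every X minutes, which matches scheduling the chain rather than the Web task alone.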