
Use scp to copy files of specific extension from directory



I'm writing a bash script that needs to fetch all *_out.csv files from a directory on a remote server. All of these files are several directories deep inside another directory. For instance, say the directory is called ox_20190404/. I can find all my files by running:



find ox_20190404/assessment/LWR/validation -type f -name "*_out.csv"



This question answers part of my question, but since I don't want to copy the directory in its entirety, I need to figure out how to build the find results into the copy. Suppose I start with this:



$ dir="/projects/ox/git"
$ server="myusername@server"
$ scp $server:$dir/$(ssh $server "ls -t $dir | head -1") .


How would I grab the files I need from there?



The last part of my question asks whether there is a way to place all the copied files in the same paths and directories they occupied on the remote server.
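That mirroring step can be sketched entirely locally. In this hypothetical stand-in, the hard-coded list replaces real `ssh`/`find` output and a no-op replaces the actual `scp`:

```shell
# Sketch of mirroring remote paths locally. The list stands in for
# the output of: ssh "$server" "find $dir -type f -name '*_out.csv'"
base="$(mktemp -d)"
files="ox_20190404/assessment/LWR/validation/a/one_out.csv
ox_20190404/assessment/LWR/validation/b/two_out.csv"

while IFS= read -r f; do
  mkdir -p "$base/$(dirname "$f")"  # recreate the remote directory layout
  : > "$base/$f"                    # placeholder for: scp "$server:$dir/$f" "$base/$f"
done <<EOF
$files
EOF
```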






























  • To clarify, you want the directory under $dir that has the most recent ... name? timestamp? You hard-coded ox_20190404 in the lead-up, so it's not clear how you selected it.

    – Jeff Schaller
    Apr 5 at 20:11











  • @JeffSchaller Suppose I'm ssh'd into the server. If I type ls -t /projects/ox/git | head -1 then ox_20190404 is the directory that is returned. I then want to go inside that folder and get the files from there.

    – dylanjm
    Apr 5 at 20:13












  • is zsh available on $server?

    – Jeff Schaller
    Apr 5 at 20:14











  • @JeffSchaller It appears so, but it's not really setup (no .zshrc files).

    – dylanjm
    Apr 5 at 20:15












  • and so the final scp command would explicitly list all of the *_out.csv files underneath the most recent directory under $dir in order to be copied locally?

    – Jeff Schaller
    Apr 5 at 20:20

















bash shell-script rsync scp






asked Apr 5 at 19:50 by dylanjm; edited Apr 6 at 19:31 by ctrl-alt-delor



3 Answers
































I've adjusted some of your variable names a bit.



Surely there are better ways to do this than something dangerous like parsing the output of ls, but see whether this works for you:



$ pth="/projects/ox/git"
$ server="myusername@server"
$ dir="$(ssh $server "ls -t $pth | head -1")"
$ mkdir -p "$pth/$dir"
$ scp -p $server:"$pth/$dir"/'*_out.csv' "$pth/$dir"/


Once dir has been set to the newest remote directory, mkdir -p is used to ensure that the same directory name exists locally. Then scp the files into a local directory with the same path and name as the remote directory. I was looking for an rsync solution, but couldn't think of one.
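The mkdir-then-copy pattern can be exercised without a server at all. In this sketch a temp directory stands in for the remote side (`cp -p` in place of `scp -p`; all names illustrative):

```shell
# Fake "remote" side: two dated directories, the newer one holding a csv.
remote="$(mktemp -d)"; local_root="$(mktemp -d)"
mkdir -p "$remote/ox_20190403" "$remote/ox_20190404"
echo data > "$remote/ox_20190404/run_out.csv"
touch -d '2019-04-03' "$remote/ox_20190403"   # backdate the older directory

# Same selection as above, minus ssh: newest directory by mtime.
dir="$(ls -t "$remote" | head -1)"

# Mirror the directory name locally, then copy the csv files into it.
mkdir -p "$local_root/$dir"
cp -p "$remote/$dir"/*_out.csv "$local_root/$dir"/
```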






answered Apr 5 at 21:19 by Jim L.

  • Your answer gets me most of the way there, but is there a recursive flag we can set to get all the *_out.csv files? The files I want aren't directly in $dir but scattered about inside that folder. Do you get what I mean?

    – dylanjm
    yesterday











  • As an exercise to build your skills, can you craft a find command line to run on the remote server that will create a list of all of the remote server's *_out.csv files that are in or below $pth/$dir? Then capture that list into a temp file foo or some better name, and then rsync -av $server: "$pth/$dir" --files-from foo Then update your question to fully describe the NEW steps you're taking, and describe what part of the task you're still missing.

    – Jim L.
    yesterday












  • Please see my answer as I think it will shed more light on what I was trying to do.

    – dylanjm
    11 hours ago
































This will find the most recently modified (created) directory, assuming that the directory name does not contain a newline (\n)



newest=$(
ssh -qn REMOTE 'find ./* -mindepth 0 -maxdepth 0 -type d -printf "%T@\t%f\n"' |
sort -t$'\t' -r -nk1,2 |
head -n1 |
cut -f2-
)
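The pipeline can be checked locally without ssh. Here two directories get known mtimes and the newest one should win (GNU find/sort assumed; names illustrative):

```shell
# Local check of the newest-directory pipeline.
work="$(mktemp -d)"
cd "$work"
mkdir old_dir new_dir
touch -d '2019-01-01' old_dir   # backdate one directory
touch -d '2019-04-05' new_dir

newest=$(
find ./* -mindepth 0 -maxdepth 0 -type d -printf "%T@\t%f\n" |
sort -t$'\t' -r -nk1,2 |
head -n1 |
cut -f2-
)
```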


If you can guarantee that the target contains only directories of interest you can simplify it considerably (again, bearing in mind the newline issue)



newest=$(ssh -qn REMOTE ls -t | head -n1)


You can copy an entire tree of files using scp, but if you want to filter it you'll probably be better off using rsync



rsync -av --include '*/' --include '*_out.csv' --exclude '*' --prune-empty-dirs REMOTE:"$newest" "$newest"


If you're keeping the previous set of files locally and you really just wanted to add the latest set without copying the previous ones, rsync can do that too



rsync -av --include '*/' --include '*_out.csv' --exclude '*' --prune-empty-dirs REMOTE: .
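By my reading of the filter rules, that include/exclude chain transfers exactly the *_out.csv files anywhere under the tree and nothing else — the same set this local `find` prints (a check of the selection only, not of rsync itself):

```shell
# Small tree: one matching file, one non-matching file.
tree="$(mktemp -d)"
mkdir -p "$tree/deep/deeper"
touch "$tree/deep/deeper/run_out.csv" "$tree/deep/notes.txt"

# Files the include/exclude chain would copy, by my reading of the rules:
matched="$(find "$tree" -type f -name '*_out.csv')"
```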






    This is the code that ended up working for me. I might not have described my question perfectly, but I wasn't having trouble finding the most recently changed directory. My problem was then finding all the files in that directory and ensuring they ended up in the right place on my local machine. Here is the bash script to do so:



    # Grab the most recently updated ox directory off the server; return it as a string
    # of the form ox_XXXXXXXX/assessment/LWR/validation/*
    newest=$(
    ssh -qn username@server 'find /projects/ox/git/* -mindepth 0 -maxdepth 0 -type d -printf "%T@\t%f\n"' |
    sort -t$'\t' -r -nk1,2 |
    head -n1 |
    cut -f2- |
    awk '{print "/projects/ox/git/"$1"/assessment/LWR/validation/HBEP/analysis/BK363/*"}'
    )

    # Take $newest and find all associated *_out.csv files beneath that directory
    newestf=$(
    ssh -qn username@server "find $newest -type f -name '*_out.csv'"
    )

    # Write these filepaths to a .csv on the local machine
    echo "$newestf" | tr " " "\n" > remote_fp.csv

    # Run an Rscript to parse and transform the filepaths so they go to the right place on the local machine
    Rscript ~/transform_fp.R

    # Read the remote file paths from the .csv - we'll need these to actually pull the files using scp
    get_scp_fp=$(awk -F '"*,"*' '{print $1}' ~/remote_fp.csv)

    # Read the local file paths from the .csv - we'll need these to actually write the data locally
    get_local_fp=$(awk -F '"*,"*' '{print $1}' ~/local_fp.csv)

    # Loop through the file paths and pull data from remote to local.
    for i in $get_scp_fp; do
    for j in $get_local_fp; do
    scp -p username@server:"$i" "$j"
    done
    done


    Rscript:



    suppressPackageStartupMessages(library(tidyverse))

    test <- read_csv("remote_fp.csv", col_names = FALSE)

    str_replace_all(test$X1, "/projects/ox/git/ox_[0-9]{8}", "~/Documents/projects/ox") %>%
    str_replace_all("(?:analysis).*$", paste0("doc/figures/", basename(.))) %>%
    tibble() %>%
    write_csv(path = "~/local_fp.csv", col_names = FALSE)
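For reference, the same path rewrite can be sketched in plain `sed`, avoiding the R round-trip. The regexes mirror the R version; the sample path is illustrative:

```shell
# One remote path, rewritten to its local destination as the R script does.
remote_fp="/projects/ox/git/ox_20190404/assessment/LWR/validation/HBEP/analysis/BK363/case1_out.csv"
bn="$(basename "$remote_fp")"
local_fp="$(printf '%s\n' "$remote_fp" |
  sed -E "s#/projects/ox/git/ox_[0-9]{8}#$HOME/Documents/projects/ox#; s#analysis.*\$#doc/figures/$bn#")"
```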







      3 Answers
      3






      active

      oldest

      votes








      3 Answers
      3






      active

      oldest

      votes









      active

      oldest

      votes






      active

      oldest

      votes









      1














      I've adjusted some of your variable names a bit.



      Surely there are better ways to do this than something dangerous like parsing the output of ls, but see whether this works for you:



      $ pth="/projects/ox/git"
      $ server="myusername@server"
      $ dir="$(ssh $server "ls -t "$pth" | head -1")"
      $ mkdir -p "$pth/$dir"
      $ scp -p $server:"$pth/$dir"/'*_out.csv' "$pth/$dir"/


      Once dir has been set to the newest remote directory, mkdir -p is used to ensure that the same directory name exists locally. Then scp the files into a local directory with the same path and name as the remote directory. I was looking for an rsync solution, but couldn't think of one.






      share|improve this answer























      • Your answer gets me most of the way there, but is there a recursive flag we can set to get all the *_out.csv files? The files I want aren't directly in $dir but scattered about inside that folder. Do you get what I mean?

        – dylanjm
        yesterday











      • As an exercise to build your skills, can you craft a find command line to run on the remote server that will create a list of all of the remote server's *_out.csv files that are in or below $pth/$dir? Then capture that list into a temp file foo or some better name, and then rsync -av $server: "$pth/$dir" --files-from foo Then update your question to fully describe the NEW steps you're taking, and describe what part of the task you're still missing.

        – Jim L.
        yesterday












      • Please see my answer as I think it will shed more light on what I was trying to do.

        – dylanjm
        11 hours ago















      1














      I've adjusted some of your variable names a bit.



      Surely there are better ways to do this than something dangerous like parsing the output of ls, but see whether this works for you:



      $ pth="/projects/ox/git"
      $ server="myusername@server"
      $ dir="$(ssh $server "ls -t "$pth" | head -1")"
      $ mkdir -p "$pth/$dir"
      $ scp -p $server:"$pth/$dir"/'*_out.csv' "$pth/$dir"/


      Once dir has been set to the newest remote directory, mkdir -p is used to ensure that the same directory name exists locally. Then scp the files into a local directory with the same path and name as the remote directory. I was looking for an rsync solution, but couldn't think of one.






      share|improve this answer























      • Your answer gets me most of the way there, but is there a recursive flag we can set to get all the *_out.csv files? The files I want aren't directly in $dir but scattered about inside that folder. Do you get what I mean?

        – dylanjm
        yesterday











      • As an exercise to build your skills, can you craft a find command line to run on the remote server that will create a list of all of the remote server's *_out.csv files that are in or below $pth/$dir? Then capture that list into a temp file foo or some better name, and then rsync -av $server: "$pth/$dir" --files-from foo Then update your question to fully describe the NEW steps you're taking, and describe what part of the task you're still missing.

        – Jim L.
        yesterday












      • Please see my answer as I think it will shed more light on what I was trying to do.

        – dylanjm
        11 hours ago













      1












      1








      1







      I've adjusted some of your variable names a bit.



      Surely there are better ways to do this than something dangerous like parsing the output of ls, but see whether this works for you:



      $ pth="/projects/ox/git"
      $ server="myusername@server"
      $ dir="$(ssh $server "ls -t "$pth" | head -1")"
      $ mkdir -p "$pth/$dir"
      $ scp -p $server:"$pth/$dir"/'*_out.csv' "$pth/$dir"/


      Once dir has been set to the newest remote directory, mkdir -p is used to ensure that the same directory name exists locally. Then scp the files into a local directory with the same path and name as the remote directory. I was looking for an rsync solution, but couldn't think of one.






      share|improve this answer













      I've adjusted some of your variable names a bit.



      Surely there are better ways to do this than something dangerous like parsing the output of ls, but see whether this works for you:



      $ pth="/projects/ox/git"
      $ server="myusername@server"
      $ dir="$(ssh $server "ls -t "$pth" | head -1")"
      $ mkdir -p "$pth/$dir"
      $ scp -p $server:"$pth/$dir"/'*_out.csv' "$pth/$dir"/


      Once dir has been set to the newest remote directory, mkdir -p is used to ensure that the same directory name exists locally. Then scp the files into a local directory with the same path and name as the remote directory. I was looking for an rsync solution, but couldn't think of one.







      share|improve this answer












      share|improve this answer



      share|improve this answer










      answered Apr 5 at 21:19









      Jim L.Jim L.

      1513




      1513












      • Your answer gets me most of the way there, but is there a recursive flag we can set to get all the *_out.csv files? The files I want aren't directly in $dir but scattered about inside that folder. Do you get what I mean?

        – dylanjm
        yesterday











      • As an exercise to build your skills, can you craft a find command line to run on the remote server that will create a list of all of the remote server's *_out.csv files that are in or below $pth/$dir? Then capture that list into a temp file foo or some better name, and then rsync -av $server: "$pth/$dir" --files-from foo Then update your question to fully describe the NEW steps you're taking, and describe what part of the task you're still missing.

        – Jim L.
        yesterday












      • Please see my answer as I think it will shed more light on what I was trying to do.

        – dylanjm
        11 hours ago

















      • Your answer gets me most of the way there, but is there a recursive flag we can set to get all the *_out.csv files? The files I want aren't directly in $dir but scattered about inside that folder. Do you get what I mean?

        – dylanjm
        yesterday











      • As an exercise to build your skills, can you craft a find command line to run on the remote server that will create a list of all of the remote server's *_out.csv files that are in or below $pth/$dir? Then capture that list into a temp file foo or some better name, and then rsync -av $server: "$pth/$dir" --files-from foo Then update your question to fully describe the NEW steps you're taking, and describe what part of the task you're still missing.

        – Jim L.
        yesterday












      • Please see my answer as I think it will shed more light on what I was trying to do.

        – dylanjm
        11 hours ago
















      Your answer gets me most of the way there, but is there a recursive flag we can set to get all the *_out.csv files? The files I want aren't directly in $dir but scattered about inside that folder. Do you get what I mean?

      – dylanjm
      yesterday





      Your answer gets me most of the way there, but is there a recursive flag we can set to get all the *_out.csv files? The files I want aren't directly in $dir but scattered about inside that folder. Do you get what I mean?

      – dylanjm
      yesterday













      As an exercise to build your skills, can you craft a find command line to run on the remote server that will create a list of all of the remote server's *_out.csv files that are in or below $pth/$dir? Then capture that list into a temp file foo or some better name, and then rsync -av $server: "$pth/$dir" --files-from foo Then update your question to fully describe the NEW steps you're taking, and describe what part of the task you're still missing.

      – Jim L.
      yesterday






      As an exercise to build your skills, can you craft a find command line to run on the remote server that will create a list of all of the remote server's *_out.csv files that are in or below $pth/$dir? Then capture that list into a temp file foo or some better name, and then rsync -av $server: "$pth/$dir" --files-from foo Then update your question to fully describe the NEW steps you're taking, and describe what part of the task you're still missing.

      – Jim L.
      yesterday














      Please see my answer as I think it will shed more light on what I was trying to do.

      – dylanjm
      11 hours ago





      Please see my answer as I think it will shed more light on what I was trying to do.

      – dylanjm
      11 hours ago













      0














      This will find the most recently modified (created) directory, assuming that the directory name does not contain a newline (n)



      newest=$(
      ssh -qn REMOTE 'find ./* -mindepth 0 -maxdepth 0 -type d -printf "%T@t%fn"' |
      sort -t$'t' -r -nk1,2 |
      head -n1 |
      cut -f2-
      )


      If you can guarantee that the target contains only directories of interest you can simplify it considerably (again, bearing in mind the newline issue)



      newest=$(ssh -qn REMOTE ls -t | head -n1)


      You can copy an entire tree of files using scp, but if you want to filter it you'll probably be better off using rsync



      rsync -av --include '*/' --include '*_out.csv' --exclude '*' --prune-empty-dirs REMOTE:"$newest" "$newest"


      If you're keeping the previous set of files locally and you really just wanted to add the latest set without copying the previous ones, rsync can do that too



      rsync -av --include '*/' --include '*_out.csv' --exclude '*' --prune-empty-dirs REMOTE: .





      share|improve this answer





























        0














          edited Apr 5 at 22:15

























          answered Apr 5 at 21:29









roaima

46k (7 gold, 58 silver, 124 bronze badges)



































This is the code that ended up working for me. I may not have described my question perfectly, but I wasn't having trouble finding the most recently changed directory; my problem was finding all the files in that directory and ensuring they ended up in the right place on my local machine. Here is the bash script to do so:



# Grab the most recently updated ox directory off the server; return as a string
# of the form ox_XXXXXXXX/assessment/LWR/validation/*
newest=$(
ssh -qn username@server 'find /projects/ox/git/* -mindepth 0 -maxdepth 0 -type d -printf "%T@\t%f\n"' |
sort -t$'\t' -r -nk1,2 |
head -n1 |
cut -f2- |
awk '{print "/projects/ox/git/"$1"/assessment/LWR/validation/HBEP/analysis/BK363/*"}'
)

# Take $newest and find all associated *_out.csv files beneath that directory
newestf=$(
ssh -qn username@server "find $newest -type f -name '*_out.csv'"
)

# Write these file paths to a .csv on the local machine
echo "$newestf" | tr " " "\n" > remote_fp.csv

# Run an Rscript to transform the file paths so each file goes to the right place on the local machine
Rscript ~/transform_fp.R

# Read the remote file paths from the .csv - we'll need these to actually pull the files using scp
get_scp_fp=$(awk -F '"*,"*' '{print $1}' ~/remote_fp.csv)

# Read the local file paths from the .csv - we'll need these to know where to write the data locally
get_local_fp=$(awk -F '"*,"*' '{print $1}' ~/local_fp.csv)

# Walk the two lists in step and pull each remote file to its matching local path
remote_arr=($get_scp_fp)
local_arr=($get_local_fp)
for idx in "${!remote_arr[@]}"; do
scp -p username@server:"${remote_arr[$idx]}" "${local_arr[$idx]}"
done


              Rscript:



              suppressPackageStartupMessages(library(tidyverse))

              test <- read_csv("remote_fp.csv", col_names = FALSE)

str_replace_all(test$X1, "/projects/ox/git/ox_[0-9]{8}", "~/Documents/projects/ox") %>%
              str_replace_all("(?:analysis).*$", paste0("doc/figures/", basename(.))) %>%
              tibble() %>%
              write_csv(path = "~/local_fp.csv", col_names = FALSE)
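For comparison, the rewrite the R script performs (swap the versioned remote prefix for a local one, then redirect everything from `analysis` onward into `doc/figures/`) could also be sketched with sed; the example path below is invented:

```shell
# Hypothetical remote path matching the layout above
remote="/projects/ox/git/ox_12345678/assessment/LWR/validation/HBEP/analysis/BK363/case_out.csv"

# 1) replace the ox_XXXXXXXX prefix with the local project root
# 2) replace everything from "analysis" onward with doc/figures/<filename>
dest=$(printf '%s\n' "$remote" |
  sed -e 's|^/projects/ox/git/ox_[0-9]\{8\}|'"$HOME"'/Documents/projects/ox|' \
      -e 's|analysis.*$|doc/figures/'"$(basename "$remote")"'|')
echo "$dest"
# -> $HOME/Documents/projects/ox/assessment/LWR/validation/HBEP/doc/figures/case_out.csv
```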















              New contributor




              dylanjm is a new contributor to this site. Take care in asking for clarification, commenting, and answering.
              Check out our Code of Conduct.
























                  edited 11 hours ago






























                  answered 11 hours ago









dylanjm

101 (4 bronze badges)













Tags: bash, rsync, scp, shell-script
