6002

I'm trying to find a way to scan my entire Linux system for all files containing a specific string of text. Just to clarify, I'm looking for text within the file, not in the file name.

When I was looking up how to do this, I came across this solution twice:

find / -type f -exec grep -H 'text-to-find-here' {} \;

However, it doesn't work. It seems to display every single file in the system.

Is this close to the proper way to do it? If not, how should I? This ability to find text strings in files would be extraordinarily useful for some programming projects I'm doing.

Peter Mortensen
Nathan
  • 30
    remember that grep will interpret any `.` as a single-character wildcard, among others. My advice is to always use either fgrep or egrep. – Walter Tross Oct 28 '13 at 11:54
  • 12
    anyway, you were almost there! Just replace `-H` with `-l` (and maybe `grep` with `fgrep`). To exclude files with certain patterns of names you would use `find` in a more advanced way. It's worthwile to learn to use `find`, though. Just `man find`. – Walter Tross Oct 28 '13 at 12:01
  • 7
    `find … -exec <command> +` is easier to type and faster than `find … -exec <command> \;`. It works only if `<command>` accepts any number of file name arguments. The saving in execution time is especially big if `<command>` is slow to start, like Python or Ruby scripts. – hagello Jan 28 '16 at 05:16
  • To search non-recursively in a given path the command is `grep --include=\*.txt -snw "pattern" thepath/*`. – Stéphane Laurent Aug 15 '16 at 12:34
  • @StéphaneLaurent I think you are complicating it too much. Just say `grep "pattern" path/*.txt` – fedorqui 'SO stop harming' Dec 02 '16 at 13:13
  • This question should be on Unix-Linux community. – BreakBadSP Jul 25 '18 at 05:46
  • This solution is to find a pattern (regex matching), not a specific text, which might contain symbols that need escaping. – Serge Sep 12 '19 at 14:45
  • Does this answer your question? [How can I use grep to find a word inside a folder?](https://stackoverflow.com/questions/4121803/how-can-i-use-grep-to-find-a-word-inside-a-folder) – NAND May 19 '20 at 20:42
  • grep -nri "stringstrings" /path/ – srpatch Jun 08 '20 at 05:35
  • @kenorb's is, by far, the best answer! https://stackoverflow.com/a/30138655/274502 too bad it didn't get more love. – cregox Sep 29 '20 at 20:39

52 Answers

11011

Do the following:

grep -rnw '/path/to/somewhere/' -e 'pattern'
  • -r or -R is recursive,
  • -n is line number, and
  • -w stands for match the whole word.
  • -l (lower-case L) can be added to just give the file name of matching files.
  • -e is the pattern used during the search

Along with these, the --exclude, --include, and --exclude-dir flags can be used for efficient searching:

  • This will only search through those files which have .c or .h extensions:
grep --include=\*.{c,h} -rnw '/path/to/somewhere/' -e "pattern"
  • This will exclude searching all the files ending with .o extension:
grep --exclude=\*.o -rnw '/path/to/somewhere/' -e "pattern"
  • For directories it's possible to exclude one or more directories using the --exclude-dir parameter. For example, this will exclude the dirs dir1/, dir2/ and all of them matching *.dst/:
grep --exclude-dir={dir1,dir2,*.dst} -rnw '/path/to/somewhere/' -e "pattern"

This works very well for me, to achieve almost the same purpose as yours.

For more options check man grep.
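
These flags can also be combined. For instance, a sketch (the project path and the build/ directory here are placeholders) that searches only .c and .h files while skipping a build directory:

grep --include=\*.{c,h} --exclude-dir=build -rnw '/path/to/project/' -e "pattern"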

Intrastellar Explorer
rakib_
  • 2
    How can I make it so it ignores binary files though? – Nathan Jun 06 '13 at 08:27
  • 82
    use --exclude. like "grep -rnw --exclude=*.o 'directory' -e "pattern" – rakib_ Jun 06 '13 at 08:29
  • if it doesn't work unquote the directory. also, if it's a big directory it may just hang for a sec. it's not instant. – I wrestled a bear once. Oct 30 '14 at 19:05
  • 1
    Doesn't work when trying to find something like "foo@email.com" it will return for "email.com" but as soon as you throw the (At) [@] symbol in the string, grep chokes and returns zip. – Kraang Prime Dec 17 '14 at 15:07
  • It works nice, but if i add "-i" to ignore case, it return some fake result. Anybody encountered the same problem? – eason Jan 23 '15 at 09:53
  • 138
    it's worth noting: it seems the `r` option is lazy (traverses depth-first, than stops after the first directory), while `R` is greedy (will traverse the entire tree correctly). – Eliran Malka Mar 24 '15 at 15:09
  • 1
    Also, note that it is case sensitive. – Nathan Jun 10 '15 at 02:09
  • Hi, a naive question: why you use \*.{..} after -included, but *.{...} after -excluded? – zell Aug 14 '15 at 12:56
  • @zell to indicate the type of file we want to search. From above ex, we only wanted to search files end up with .c and .h extension. – rakib_ Aug 14 '15 at 13:50
  • 7
    *grep -rnw "String I was looking for"* done what I needed. Thanks! – ViliusK Aug 19 '15 at 21:20
  • 2
    Don't forget to to tell Linux to not show you the stuff you don't need by adding "2> /dev/null" to the end of the command. This is is the only to not get a whole lot of Permission Denied warnings that only muddy the results you are looking for. – Justin Oct 15 '15 at 01:36
  • 45
    Note(especially for newbies): The quotation marks in the above command are important. – madD7 Dec 22 '15 at 12:37
  • Thanks for this answer. It seems like the only option that is really necessary is `-r`, though, if the argument is a directory. `grep pattern -r path/to/somewhere` – mkdrive2 Feb 09 '16 at 22:12
  • 3
    You can add --colour/--color to highlight your search term as well. Eg `grep -rnw --colour . -e "terminal"` – mattbell87 Mar 07 '16 at 05:27
  • 1
    @D.7 could you elaborate how they are important? – sbhatla Apr 27 '16 at 17:57
  • @sbhatla if your file name or file path contains white spaces then quotation marks ensure that the file name is read including/along-with white spaces. – madD7 Apr 27 '16 at 18:47
  • 85
    @Eliran Malka `R` en `r` will both traverse directories correctly, but `R` will follow symbolic links. – bzeaman Jul 05 '16 at 08:36
  • 1
    I prefer getting only filenames without the result, so i use `grep -rnwl` – 54l3d Dec 16 '16 at 09:13
  • 3
    Does it search in hidden files/directories within the directory ? – Nagappa L M Dec 27 '16 at 10:38
  • 1
    Your Answer doesn't work for large number of files, tested in AIX for more than 15k files didn't work. Error - "/usr/bin/grep: 0403-027 The parameter list is too long." – VIPIN KUMAR Jan 27 '17 at 14:30
  • On Ubuntu 14.04.3 this didn't worked. I mounted the directory in Samba and it found the text. – machineaddict Feb 22 '17 at 07:57
  • 4
    Can anyone explain why there is a \ in include argument --include=\\*.{c,h}? thanks – Lion Lai Mar 24 '17 at 02:52
  • 2
    I find the `-i` option very useful too to "ignore case". Maybe throw this in your list of options. – Gabriel Staples Mar 25 '17 at 01:37
  • 3
    What does the -e stand for? – Gary Apr 04 '17 at 13:20
  • 2
    This is a solid answer, but the issue is that it will only match whole words because of the `-w` parameter. I found that it will not match arbitrary text. – entpnerd May 03 '17 at 17:24
  • 1
    I thing I would like to add to this answer is that '/path/to/somewhere/' cannot be relative path. It should be absolute path from your / directory. – Sunil Kumar May 05 '17 at 06:50
  • I used your command in function withing .bashrc findin(){ grep -rnw "$2" -e "$1" } – talsibony Sep 15 '17 at 11:12
  • --include-dir flag throws an error and is not mentioned in the grep man page (at least not on my centos nor debian machines). – sf_admin Jan 03 '18 at 21:38
  • 1
    @sf_admin You are right, actually there's no `--include-dir` option. Was wondering when this was included, on my first version? Can't remember. Would be nice if it was possible to see the answer change log like git. – rakib_ Jan 04 '18 at 03:53
  • Use `s` for some amazing "ignore error and warnings" goodies! Great for big searches. – GigaBass Jun 07 '18 at 04:47
  • or more minimalistic, `grep -r 'rrr' ./`. (or as mentioned -R instead of -r, so as to include symbolic links) – barlop Mar 07 '19 at 02:07
  • One of the strings I was looking for contained special characters. You want to add a -F so it counts string literals. – Growling Flea Mar 14 '19 at 01:00
  • does not search half words – waza123 Aug 27 '19 at 08:50
  • 1
    @waza123 - Haha, yeah, to explicitly made it full word `w` is used. If you want half words (partial) just avoid `w`. – rakib_ Aug 27 '19 at 11:55
  • `grep -r text-to-find` also works good for current directory on Linux (most simple version). – korst1k Aug 27 '19 at 17:33
  • `grep --exclude-dir={'dir1','dir2','wildcard*'} -rnwl '/path/to/somewhere/' -e "pattern"` is good on OSX – charles.cc.hsu Nov 28 '19 at 04:46
  • I have tried and I do not get any filename, only content being displayed, any idea what I am doing wrong ? – Dimitri Kopriwa Mar 16 '20 at 09:28
  • @DimitriKopriwa try -l option to show file name as mentioned in the second most popular answer. – Gediminas Apr 17 '20 at 20:32
  • More on `-r` and `-R` in [this question and answer](https://stackoverflow.com/a/22763809/1028230). – ruffin Jun 29 '20 at 16:50
  • Thank you. Is there a way to include a wild card (`*`) in the pattern, so if I wanted all files containing a word starting with `beginning`, I could put `grep -rnw '/path/to/somewhere/' -e 'beginning*'`, but this doesn't work. – mikey Dec 18 '20 at 22:05
  • Is there a way to show only filenames. I meet a problem, if one file contains this text many times, then it will display in result many times – Ninja Jan 06 '21 at 03:22
  • Doesn't work for me. No matches. – Anton Kukoba Apr 02 '21 at 13:02
  • 1
    I would add `i` option for case insensitive search – David Okwii Apr 07 '21 at 21:07
  • I keep googling "find in files" knowing that I'll end up on this answer on stackoverflow... Every time I need it :-) – Axi Apr 09 '21 at 14:26
1707

You can use grep -ilR:

grep -Ril "text-to-find-here" /
  • i stands for ignore case (optional in your case).
  • R stands for recursive.
  • l stands for "show the file name, not the result itself".
  • / stands for starting at the root of your machine.
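
As noted in the comments, adding -I skips binary files, and redirecting stderr hides "Permission denied" noise; a combined sketch:

grep -RIil "text-to-find-here" / 2>/dev/null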
fedorqui 'SO stop harming'
  • How long on average (obviously depends greatly on the system) do you think this would take to scan the full system? Do you think using a regular expression with grep would make it go faster? – Nathan Jun 06 '13 at 08:12
  • 98
    Based on my experience, the `-i` makes it slow down a lot, so don't use it if not necessary. Test it in a certain dir and then generalise. It should be completed within few minutes. I think a regular expression would make it slower. But my comments are based on suppositions, I suggest you to test it with `time` in front of the line. – fedorqui 'SO stop harming' Jun 06 '13 at 08:14
  • 5
    Yes, `/*` stands for that. Anyway I just tested it and noticed that just `/` works. – fedorqui 'SO stop harming' Jun 06 '13 at 08:15
  • 1
    Okay, thanks. It seems like this is capturing a lot of files that I can't even open up in a text editor. Is it perhaps interpreting all formats at text, and so randomly finding results in binary "noise" in an executable file for instance? – Nathan Jun 06 '13 at 08:20
  • You can add this: `2>/dev/null | grep -v "Binary file"` – fedorqui 'SO stop harming' Jun 06 '13 at 08:28
  • 12
    If you are not searching using a regex you can use fgrep in place of grep on most systems. – markle976 Sep 28 '13 at 14:49
  • 10
    Yes @markle976, in fact from man grep: `fgrep is the same as grep -F -> Interpret PATTERN as a list of fixed strings`. – fedorqui 'SO stop harming' Sep 30 '13 at 08:23
  • 1
    your right, `-i` slows it down what seems 10x slower at least – wired00 Jan 04 '15 at 05:58
  • @wired00 For sure, with `-i` it is definitely slower, but it is complicated to know how much: if you run the test with and without, probably the results of the first are cached and used while running the second. Also, it is more "expensive" to check a long word (`aBcDEFghIj`) than a short one (`aBc`), because there are way more possible combinations of upper/lowercase. – fedorqui 'SO stop harming' Jan 05 '15 at 09:13
  • @fedorqui Ahh yep understood, actaully what I've been using though now is the `ack 'search text'` command, it works really nicely for what I needed it for and is very fast. Its nice because of the nice highlighting it uses. – wired00 Jan 05 '15 at 21:16
  • 22
    You can replace / with path to directory `grep -Ril "text-to-find-here" ~/sites/` or use . for current directory `grep -Ril "text-to-find-here" .` – Black Jan 28 '16 at 12:19
  • 3
    @Nathan after all this time (and more than one million views!) I noticed you can add the parameter `-I` (capital i) to exclude binary files. I think this was the key point here, instead of `exclude` or `include`. Better late than never :D – fedorqui 'SO stop harming' Mar 24 '16 at 23:47
  • @fedorqui Thanks! And I know, 1.5 million views.. pretty crazy, right? I appreciate the answer you gave 2 years ago :) – Nathan Mar 29 '16 at 22:11
  • 1
    @Nathan very crazy!! Funny thing is that this became a canonical Q&A on looking for a string, even though the real problem was about excluding binary files : ) – fedorqui 'SO stop harming' Jun 02 '16 at 08:52
  • `grep -rl 'pattern' .` Recursive works with lowercase `-r` as well. Search Scope dot ` . ` for current directory is more intutive AFAIK rathan than ` / ` – nitinr708 Jul 08 '16 at 09:43
  • @nitinr708i `-R` and `-r` are different. From `man grep`: _Read all files under each directory, recursively. Follow all links, unlike -r._ – fedorqui 'SO stop harming' Jul 08 '16 at 09:45
  • Is it possible to display also the line number in the file where the string appears? – W.M. Dec 16 '16 at 14:43
  • 1
    @W.M. yes, just use `-n`: `grep -n "pattern" file`. – fedorqui 'SO stop harming' Dec 16 '16 at 14:44
  • 1
    -i very handy especially when browsing through code someone else made. Some languages like fortran are case insensitive when e.g. when declaring functions: FUNCTION/function dummy(x).. – Communisty Jun 30 '17 at 07:56
  • @fedorqui: I am surprised to hear that `grep` slows down on `-i` and long words, as I should have thought that the [Boyer-Moore](https://en.wikipedia.org/wiki/Boyer-Moore_string_search_algorithm) “generated-skip-length-table-driven” string search algorithm would be just as fast on `-i` and faster on longer words — presumably Linux `grep` is using something simpler! But https://en.wikipedia.org/wiki/Boyer%E2%80%93Moore_string_search_algorithm#Implementations says it is used! – PJTraill Nov 01 '17 at 21:18
  • @PJTraill I don't know the specifics. Common sense would say that looking for "abc" is easier than "Abc", "aBc", "abC" and so on, and timing examples showed me a big difference. But I am sure you know more about it than me! Maybe worth a question? – fedorqui 'SO stop harming' Nov 10 '17 at 14:06
  • How can I modify this command to look only in specific file types, e.g. in `.py` and `.txt` files? – Cleb Nov 28 '17 at 10:13
  • 2
    @Cleb see [grep, but only certain file extensions](https://stackoverflow.com/q/12516937/1983854) – fedorqui 'SO stop harming' Nov 28 '17 at 11:46
  • @fedorqui Could we get an example of how to apply this to only specific files? For example, find all instances of filename.txt which contain 'x' text (recursive)? Thanks" – omega1 Jan 21 '18 at 11:46
  • 1
    @omega1 something like `find -name filename.txt -exec grep 'x' {} \;` – fedorqui 'SO stop harming' Jan 22 '18 at 07:36
  • how do I get rid of the warnings? I only want the files names – matias May 29 '18 at 15:32
  • @matias see [my comment](https://stackoverflow.com/questions/16956810/how-do-i-find-all-files-containing-specific-text-on-linux/16956844#comment24487774_16956844) from a while ago. Also consider using `-I` for binary files. – fedorqui 'SO stop harming' May 30 '18 at 10:22
  • 1
    very useful much more util – Darlan D. Jun 16 '19 at 04:22
  • 1
    freaking life saver!! What takes a solid 60 seconds in notepad ++ takes about 2 seconds using this command. I was missing my life :O – Gogol Sep 25 '19 at 12:58
  • 1
    "_You can use `grep -ilR`_" then immediately proceeds to rearrange options to less ill `grep -Ril "text-to-find-here" /`. ;^D I've become partial to "**grep in real life**" (`-iRl`) now myself. – ruffin Jun 29 '20 at 16:53
  • How to avoid searching subdirectories? If `-R` is omitted like this `grep -il 'searchtext' /path/to/dir/` it says `grep: /path/to/dir/: Is a directory`. `-R` is mandatory? – sjd Jul 14 '20 at 07:48
  • @sjd not at all, you can omit `-R`. Those messages go through stderr, so you can hide them by saying `grep -Ril '...' 2>/dev/null` – fedorqui 'SO stop harming' Jul 14 '20 at 07:53
  • 1
    @fedorqui'SOstopharming' Thanks for the quick response. But this seems working. to avoid subdirectories `grep -nl "text" /search/path/*` . Remove `r` and use `*` and it searched only in the given dir – sjd Jul 14 '20 at 11:48
  • @sjd the only problem with that approach is that it will exclude the hidden files. You can use `grep -nl "text" /search/path/{*,.*}` to include them, although it will produce some directory errors when trying to grep in `.` and `..`. – fedorqui 'SO stop harming' Jul 14 '20 at 11:59
  • 1
    Unlike the best result, this one works – Anton Kukoba Apr 02 '21 at 13:02
371

You can use ack. It is like grep for source code. You can scan your entire file system with it.

Just do:

ack 'text-to-find-here'

In your root directory.

You can also use regular expressions, specify the filetype, etc.
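
For example, a sketch using ack's built-in type filters (--python here is one of them) together with a case-insensitive search:

ack -i --python 'text-to-find-here'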


UPDATE

I just discovered The Silver Searcher, which is like ack but 3-5x faster, and it even ignores patterns from a .gitignore file.

RAJ
Stephan
  • 64
    Very useful, simple and fast. Warning: "On Debian-derived distros, ack is packaged as "ack-grep" because "ack" already existed" (from http://beyondgrep.com/install/). You may end up running a Kanji code converter on those Linuxes... – Jose_GD Sep 20 '13 at 13:32
  • 11
    ack or ack-grep has nice highlights, but find+grep when proper used is much better in performance – Sławomir Lenart Feb 11 '15 at 09:00
  • 23
    Note that [ripgrep](https://github.com/BurntSushi/ripgrep) is faster than anything else mentioned here, including The Silver Searcher and plain 'ol grep. See [this blog post](http://blog.burntsushi.net/ripgrep/) for proof. – Radon Rosborough Oct 14 '17 at 04:01
229

You can use:

grep -r "string to be searched"  /path/to/dir

The r stands for recursive and so will search in the path specified and also its sub-directories. This will tell you the file name as well as print out the line in the file where the string appears.

Or a command similar to the one you are trying, for example, for searching in all JavaScript files (*.js):

find . -name '*.js' -exec grep -i 'string to search for' {} \; -print

This will print the lines in the files where the text appears; grep itself does not prefix them with the file name, but the trailing -print makes find print the path of each file that contains a match.

In addition to this command, we can also write: grep -rn "String to search" /path/to/directory/or/file (-r: recursive search, -n: the line number will be shown for matches).

learner_19
  • 1
    Thanx for the find version. My grep version (busybox for NAS) hasn't the -r option, i really needed another solution! – j.c Sep 02 '16 at 10:34
  • 3
    Thank you for the 'find' version! It is so important to be able to filter by '*.js' or '*.txt', etc. Nobody wants to spend hours waiting for grep to finish searching all the multi-gigabyte videos from the last family vacation, even if the command is easier to type. – mightypile Aug 16 '17 at 15:10
  • better grep than accepted version, because accepted do not search half words – waza123 Aug 27 '19 at 08:53
128

You can use this:

grep -inr "Text" folder/to/be/searched/
Sudipta
A R
94

grep (GNU or BSD)

You can use grep tool to search recursively the current folder, like:

grep -r "class foo" .

Note: -r - Recursively search subdirectories.

You can also use globbing syntax to search within specific files such as:

grep "class foo" **/*.c

Note: With the globbing option (**), it scans all the files recursively with a specific extension or pattern. To enable this syntax, run: shopt -s globstar. You may also use **/*.* for all files (excluding hidden ones and those without an extension) or any other pattern.
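
For example, a sketch that enables globstar and then searches all .c and .h files under the current directory:

shopt -s globstar
grep -nH "class foo" **/*.{c,h}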

If you get an "Argument list too long" error, consider narrowing down your search, or use the find syntax instead, such as:

find . -name "*.php" -execdir grep -nH --color=auto foo {} ';'

Alternatively, use ripgrep.

ripgrep

If you're working on larger projects or big files, you should use ripgrep instead, like:

rg "class foo" .

Check out the docs, installation steps or source code on the GitHub project page.

It's much quicker than any other tool like GNU/BSD grep, ucg, ag, sift, ack, pt or similar, since it is built on top of Rust's regex engine which uses finite automata, SIMD and aggressive literal optimizations to make searching very fast.

It supports ignore patterns specified in .gitignore files, so a single file path can be matched against multiple glob patterns simultaneously.


You can use common parameters such as:

  • -i - Case-insensitive searching.
  • -I - Ignore binary files.
  • -w - Search for whole words (as opposed to partial word matching).
  • -n - Show the line of your match.
  • -C/--context (e.g. -C5) - Increases context, so you see the surrounding code.
  • --color=auto - Mark up the matching text.
  • -H - Displays filename where the text is found.
  • -c - Displays count of matching lines. Can be combined with -H.
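
Combining a few of these with ripgrep (src/ here is a placeholder directory):

rg -inw -C 3 "class foo" src/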
kenorb
  • 2
    I also find extended globbing useful. But keep in mind that if there are really huge number of files, you can get a "Argument list too long" error. (Simple globbing is also prone to this kind of error). – Yoory N. Nov 30 '17 at 06:47
  • 7
    For inhaling a whole file system, rg is gonna be far less painful than almost any other tool. – l.k Apr 23 '19 at 06:11
  • lol i'm sorry I downvoted this answer yesterday by mistake and now I can't change it ;_; here's 10 robot parts for apologies <3 – Michael Villeneuve Jun 17 '20 at 19:47
  • it could use a better name, though... i don't use it everyday and it's get hard to remember this name when i want to use it! – cregox Aug 28 '20 at 12:18
86

List of file names containing a given text

First of all, I believe you have used -H instead of -l. Also, you can try adding the text inside quotes followed by {} \;.

find / -type f -exec grep -l "text-to-find-here" {} \; 

Example

Let's say you are searching for files containing specific text "Apache License" inside your directory. It will display results somewhat similar to below (output will be different based on your directory content).

bash-4.1$ find . -type f -exec grep -l "Apache License" {} \; 
./net/java/jvnet-parent/5/jvnet-parent-5.pom
./commons-cli/commons-cli/1.3.1/commons-cli-1.3.1.pom
./io/swagger/swagger-project/1.5.10/swagger-project-1.5.10.pom
./io/netty/netty-transport/4.1.7.Final/netty-transport-4.1.7.Final.pom
./commons-codec/commons-codec/1.9/commons-codec-1.9.pom
./commons-io/commons-io/2.4/commons-io-2.4.pom
bash-4.1$ 

Remove case sensitiveness

Even if you are not sure about the case, like "text" vs. "TEXT", you can use the -i switch to ignore case. You can read further details here.

Hope this helps you.

lkamal
  • 2
    Which is what this command does: `find` will pass all the paths it finds to the command `grep -l "text-to-find-here" "`. You may add restrictions to the file name, e.g. `find / -iname "*.txt"` to search only in files which name ends in `.txt` – Mene Apr 20 '17 at 13:46
  • 1
    @Auxiliary - included a sample output to avoid any confusion for the readers. – lkamal Oct 07 '17 at 05:56
  • 2
    @Mene It's a truly sad state that Auxiliary's comment has more votes than yours...even if their comment is from 2014 and yours is 2017 that their comment has 6 when it should have exactly 0 and yours only had one (now two) isn't something I'd like to believe. – Pryftan May 01 '18 at 23:01
  • @Mene That being said *`-iname`* is case-insensitive which means it would also find .TXT files, for example, as well as TxT and TXt and so on. – Pryftan May 01 '18 at 23:04
60

If your grep doesn't support recursive search, you can combine find with xargs:

find / -type f | xargs grep 'text-to-find-here'

I find this easier to remember than the format for find -exec.

This will output the filename and the content of the matched line, e.g.

/home/rob/file:text-to-find-here

Optional flags you may want to add to grep:

  • -i - case insensitive search
  • -l - only output the filename where the match was found
  • -h - only output the line which matched (not the filename)
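
If file names may contain spaces, the null-separated variant mentioned in the comments is safer:

find / -type f -print0 | xargs -0 grep -l 'text-to-find-here'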
RobEarl
  • 3
    This is equivalent to `grep 'text-to-find-here'` without file name if `find` does not find anything. This will hang and wait for user input! Add `--no-run-if-empty` as an option to `xargs`. – hagello Jan 28 '16 at 05:46
  • 4
    This combination of find and xargs does not work as intended if file or directory names contain spaces (characters that xargs interprets as separators). Use `find … -exec grep … +`. If you insist on using find together with xargs, use `-print0` and `-0`. – hagello Jan 28 '16 at 05:50
48

There's a new utility called The Silversearcher

sudo apt install silversearcher-ag

It works closely with Git and other VCSs, so it won't search inside .git or other VCS directories.

You can simply use

ag "Search query"

And it will do the task for you!
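
As with grep, you can pass a path and extra flags, for example (a sketch with a placeholder path):

ag -il 'text-to-find-here' /path/to/dir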

Neil Agarwal
  • Good call!. I downloaded it and used it first time. The output results are very informative and colourful and very helpfull. This prog will stay in my machine for ever. I have also put it on my "Install a new computer" list of programs. Cheers!! – joe_evans Jul 06 '20 at 15:01
47
grep -insr "pattern" *
  • i: Ignore case distinctions in both the PATTERN and the input files.
  • n: Prefix each line of output with the 1-based line number within its input file.
  • s: Suppress error messages about nonexistent or unreadable files.
  • r: Read all files under each directory, recursively.
Fabio Poloni
enfinet
  • 3
    Can you explain how your answer improves upon the other answers, or how it is sufficiently different from them? – Amos M. Carpenter Feb 26 '16 at 06:10
  • not much complex to remember, will cover all patterns(case-senstivity -> off, includes file-names and line number and will do recursively search etc) and using "*" in the end will search all directories (no need to specify any path or directory name). – enfinet Feb 26 '16 at 06:15
  • Sorry, I should've been clearer: it would be great if you could include that explanation in your answer. As it stands, especially with so many other similar answers already, it is hard to see from such a short answer what the benefit of trying _it_ over the accepted answer or one of the upvoted ones would be. – Amos M. Carpenter Feb 26 '16 at 06:35
  • 6
    @AmosM.Carpenter One thing I love about this answer is pointing out the suppress argument, which can help filter out noise that doesn't matter to getting the results we actually want. Grep prints errors like, "Function not implemented", "Invalid Argument", "Resource unavailable", etc. etc on certain "files". – leetNightshade Feb 20 '17 at 05:58
  • @leetNightshade: I'm assuming you're addressing your comment to me because I asked for an explanation on the sparse original post. Please see Fabio's great [revision](http://stackoverflow.com/posts/35644413/revisions) for my previous comments to make sense. – Amos M. Carpenter Feb 20 '17 at 11:59
  • I'm quite fond of ` -I Ignore binary files.`. – Bruno Bronosky Feb 08 '19 at 08:29
43

How do I find all files containing specific text on Linux? (...)

I came across this solution twice:

find / -type f -exec grep -H 'text-to-find-here' {} \;


If using find as in your example, it is better to add -s (--no-messages) to grep, and 2>/dev/null at the end of the command, to avoid lots of Permission denied messages issued by grep and find:

find / -type f -exec grep -sH 'text-to-find-here' {} \; 2>/dev/null

find is the standard tool for searching files - combined with grep when looking for specific text - on Unix-like platforms. The find command is often combined with xargs, by the way.

Faster and easier tools exist for the same purpose - see below. Try them, provided they're available on your platform, of course:

Faster and easier alternatives

RipGrep - fastest search tool around:

rg 'text-to-find-here' / -l

The Silver Searcher:

ag 'text-to-find-here' / -l

ack:

ack 'text-to-find-here' / -l

Note: You can add 2>/dev/null to these commands as well, to hide many error messages.


Warning: unless you really can't avoid it, don't search from '/' (the root directory), to avoid a long and inefficient search! So in the examples above, you'd better replace '/' with a sub-directory name, e.g. "/home", depending on where you actually want to search...
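
For example, a sketch limited to /home instead of the whole file system:

find /home -type f -exec grep -sH 'text-to-find-here' {} \; 2>/dev/null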

Bludzee
  • *'find is the standard tool for searching files containing specific text on Unix-like platforms'* seems rather ambiguous to me. Even besides recursive `grep` `find` doesn't directly search the inside of files for text. And maybe those additional tools are useful to some but old timers and those whoa are well accustomed to e.g. *`grep`* wouldn't give them any time at all (well I certainly won't). Not saying they're useless though. – Pryftan May 01 '18 at 23:36
  • "....containing specific text..." : this part of the sentence was not accurate (because it's not find itself that deals with this part of the search). Edited. Thanks. – Bludzee Jun 01 '18 at 09:21
  • Glad to be of help! The only thing else at a very very quick glance is changing the word *folder* to *directory* but I know that’s a crusade of mine I will never win completely. Not giving up though... – Pryftan Jun 01 '18 at 16:13
  • Why not "directory" instead of "folder", but why ? Please share your "crusade" ! – Bludzee Jun 01 '18 at 17:00
  • I'm saying use directory instead! Referring to: **you'd better replace '/' by a sub-folder name** And it's a pet peeve of mine.. esp since even Windows used to call it 'directory'. Ah..maybe you got that. Why? Well because that's what it's called. It's also called that at the file system level. And look at it this way: was it ever called (for DOS) *`fol`*? No of course not; it was called *`dir`* (and I believe it still is). Folder is a thing contrived for (I guess) user friendliness though in this case it's maybe dumbing it down for less 'advanced' users? – Pryftan Jun 02 '18 at 00:15
  • Perhaps part of it is semantics and pedantry but as a programmer who also has a fascinating with language (not just computer languages) ... well it matters to me. And the functions also reference directories not folders (no idea in Windows though). – Pryftan Jun 02 '18 at 00:19
  • Good. Much better. :) I up-voted too. One other thought is you might want to say that if the regexp/string to be found includes a *`-`* you might be wise to first pass to grep *`--`*. I was also pretty sure that *`{}`* should be quoted or escaped but I could be remembering wrong. Also I find it instructive that you included alternative tools (not that I would ever use them but then I don't need help with this task in the question anyway). Of course depending on what's wanted you could just as well use recursion with *`grep`*. Just some additional thoughts for whatever they might be worth. – Pryftan Jun 02 '18 at 20:04
33

Try:

find . -name "*.txt" | xargs grep -i "text_pattern"
kenorb
venkat
  • 5
    This is actually a prime example of when NOT to use `xargs` like that .. consider this. ```echo "file bar.txt has bar" > bar.txt; echo "file foo bar.txt has foo bar" > "foo bar.txt"; echo "You should never see this foo" > foo; find . -name "*.txt" | xargs grep -i foo # ./foo:You should never see this foo``` . The `xargs` here matched the WRONG file and did NOT match the intended file. Either use a `find .. -print0 | xargs -0 ...` but that's a useless use of a pipe or better `find ... -exec grep ... {} +` – shalomb Oct 11 '16 at 20:10
31

Use pwd to search from any directory you are in, recursing downward

grep -rnw `pwd` -e "pattern"

Update: Depending on the version of grep you are using, you can omit pwd. In newer versions, . (the current directory) seems to be the default for grep -r if no directory is given, thus:

grep -rnw -e "pattern"

or

grep -rnw "pattern"

will do the same thing as above!

mahatmanich
  • 4
    using `pwd` is not necessary at all, since it is the default. `grep -rnw "pattern"` suffices. – fedorqui 'SO stop harming' Dec 02 '16 at 13:17
  • and in fact the `grep -rnw` and similar is what was answered like three years ago, I don't see how this answer is adding value. – fedorqui 'SO stop harming' Dec 02 '16 at 14:03
  • The selected answer does not show the default pattern, and 5 peoples seemed to have found it useful – mahatmanich Dec 14 '16 at 08:27
  • What do you mean with "default pattern"? The accepted answer contains `grep -rnw '/path/to/somewhere/' -e "pattern"` which is what you have here. 5 votes after 2.3M visits does not mean that much. – fedorqui 'SO stop harming' Dec 14 '16 at 08:45
  • I agree :-) what I was missing in the original answer is the use case that you don't have to give a path at all or to search the current directory recursively which is not reflected in the accepted answer. Thus it was a good learning experience about grep to dig a bit deeper. – mahatmanich Dec 14 '16 at 14:05
  • OK, I see. It is not trivial to see that `grep "pattern"` needs something to check (a dir, a file...) while `grep -R "pattern"` works standalone. Then probably an addition to the answer would benefit more people (last posts are rarely noticed). But I am glad you learned from it :) I have the 2nd most upvoted answer and I see there are many, many variants of doing this. – fedorqui 'SO stop harming' Dec 14 '16 at 14:39
  • So with `pwd` I was trying to find an easy hack to not type in the full path, but I am sure `.` would suffice as well, as current directory, but leaving it out altogether is of course the leanest. – mahatmanich Dec 14 '16 at 15:07
  • Actually I just now am running into issues where, if the path is not given, there is no output. Thus `pwd` is needed @fedorqui – mahatmanich Feb 02 '17 at 10:09
  • `cd /tmp; mkdir mytest; cd mytest; mkdir a{1..3}; seq 10 > a1/a1; seq 10 > a1/a2; seq 10 > a2/a1` once you have all of this, do write `grep -rnw 5`. This works fine to me on GNU grep 2.16. – fedorqui 'SO stop harming' Feb 02 '17 at 10:13
  • This is not working on grep 2.5.1 which I am currently working with ... – mahatmanich Feb 02 '17 at 10:38
22

grep can be used even if we're not looking for a string.

Simply running,

grep -RIl "" .

will print out the path to all text files, i.e. those containing only printable characters.

Peter Mortensen
Alex Jasmin
21

Silver Searcher is a terrific tool, but ripgrep may be even better.

It works on Linux, Mac and Windows, and was written up on Hacker News a couple of months ago (this has a link to Andrew Gallant's Blog which has a GitHub link):

Ripgrep – A new command line search tool

Peter Mortensen
AAAfarmclub
21

If you strictly want to use find then use find + grep:

find /path/to/somewhere/ -type f -exec grep -nw 'textPattern' {} \;

Steps:

  1. Use find to search files,
  2. Execute grep on all of them.

This gives you the power of find to find files.

  • Use -name Pattern if you want to grep only certain files:

find /path/to/somewhere/ -type f -name \*.cpp -exec grep -nw 'textPattern' {} \;

You can use different options of find to improve your file search.
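
For example, a sketch that also restricts the search to files modified in the last 7 days (-mtime is a standard find test; the path and pattern are placeholders):

find /path/to/somewhere/ -type f -name \*.cpp -mtime -7 -exec grep -nw 'textPattern' {} \;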

JMP
BreakBadSP
20

Here is a list of several commands that can be used to search for a text string in files.

grep "text string to search” directory-path

grep [option] "text string to search” directory-path

grep -r "text string to search” directory-path

grep -r -H "text string to search” directory-path

egrep -R "word-1|word-2” directory-path

egrep -w -R "word-1|word-2” directory-path
Atul Arvind
18
find /path -type f -exec grep -l "string" {} \;

Explanation from comments

find is a command that lets you find files and other objects like directories and links in subdirectories of a given path. If you don't specify a mask that file names should meet, it enumerates all directory objects.

-type f specifies that it should process only files, not directories, etc.
-exec grep specifies that for every found file, it should run the grep command, passing its file name as an argument to it, by replacing {} with the file name
JuanZe
Vinod Joshi
18

Hope this is of assistance...

Expanding on grep a bit to give more information in the output, for example, getting the line number in the file where the text is can be done as follows:

find . -type f -name "*.*" -print0 | xargs --null grep --with-filename --line-number --no-messages --color --ignore-case "searchtext"

And if you have an idea what the file type is you can narrow your search down by specifying file type extensions to search for, in this case .pas OR .dfm files:

find . -type f \( -name "*.pas" -o -name "*.dfm" \) -print0 | xargs --null grep --with-filename --line-number --no-messages --color --ignore-case "searchtext"

Short explanation of the options:

  1. . in the find specifies from the current directory.
  2. -name "*.*" : for all files ( -name "*.pas" -o -name "*.dfm" ) : Only the *.pas OR *.dfm files, OR specified with -o
  3. -type f specifies that you are looking for files
  4. -print0 and --null on the other side of the | (pipe) are the crucial ones, passing the filename from the find to the grep embedded in the xargs, allowing for the passing of filenames WITH spaces in the filenames, allowing grep to treat the path and filename as one string, and not break it up on each space.
Peter Mortensen
Gert van Biljon
  • `-name '*.*'` isn't what you say; it wouldn't pick up on a file called 'file' because the pattern doesn't equate to that (no .ext); `*` would however (well . files aside). But there's another thing: if you want all files why bother specifying a file name in the first place? No other comment - except that it's nice to know that there still are people who don't use the MS terminology 'folder' (which really after saying it enough I wouldn't add but I wanted to point out the slightly incorrect statement you made with file names - as well as the redundancy/uselessness in the case of 'all'). – Pryftan May 01 '18 at 23:25
16

Try:

find / -type f -exec grep -H 'text-to-find-here' {} \;

which will search all file systems, because / is the root folder.

For home folder use:

find ~/ -type f -exec grep -H 'text-to-find-here' {} \;

For current folder use:

find ./ -type f -exec grep -H 'text-to-find-here' {} \;
kenorb
user4863663
  • Perhaps the details on differences of folders are obvious to many ...but also very helpful for newbies. +1 – nilon Oct 17 '16 at 18:07
  • 1
    what is this adding to the existing answers? – fedorqui 'SO stop harming' Dec 02 '16 at 13:16
  • Call it my crusade but the word is 'directory'. This isn't Windows (which used to use 'directory' anyway - pre 9x). Please stop saying 'folder'. As for your last command you don't even need the '/' just FYI. – Pryftan May 01 '18 at 23:12
16

A simple find can come in handy. Alias it in your ~/.bashrc file:

alias ffind="find / -type f | xargs grep"

Start a new terminal and issue:

ffind 'text-to-find-here'
Peter Mortensen
danglingpointer
16

I am fascinated by how simple grep makes it with 'rl':

grep -rl 'pattern_to_find' /path/where/to/find

-r to recursively find a file / directory inside directories..
-l to list files matching the 'pattern'

Use '-r' without 'l' to see the file names followed by text in which the pattern is found!

grep -r 'pattern_to_find' /path/where/to/find

It works just perfect...

Peter Mortensen
nitinr708
15

I wrote a Python script which does something similar. This is how one should use this script.

./sniff.py path pattern_to_search [file_pattern]

The first argument, path, is the directory in which we will search recursively. The second argument, pattern_to_search, is a regular expression which we want to search in a file. We use the regular expression format defined in the Python re library. In this script, the . also matches newline.

The third argument, file_pattern, is optional. This is another regular expression which works on a filename. Only those files which match this regular expression will be considered.

For example, if I want to search Python files with the extension py containing Pool( followed by the word Adaptor, I do the following:

./sniff.py . "Pool(.*?Adaptor"  .*py
./Demos/snippets/cubeMeshSigNeur.py:146 
./Demos/snippets/testSigNeur.py:259 
./python/moose/multiscale/core/mumbl.py:206 
./Demos/snippets/multiComptSigNeur.py:268 

And voila, it generates the path of matched files and line number at which the match was found. If more than one match was found, then each line number will be appended to the filename.
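
A rough grep-only approximation of the example above (approximate, because grep matches line by line, while the script can match across newlines):

grep -rEn --include='*.py' 'Pool\(.*Adaptor' .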

Peter Mortensen
Dilawar
15

grep is your good friend to achieve this.

grep -r <text_fo_find> <directory>

If you don't care about the case of the text to find, then use:

grep -ir <text_to_find> <directory>
Peter Mortensen
Prash
  • In my case it looks like it searches everywhere even if I do specify the directory – Pathros Mar 20 '18 at 16:30
  • @Pathros Probably to do with recursion enabled and what directory you specify. Put another way recursion does change things in that way. – Pryftan May 01 '18 at 23:38
  • @Pathros Oh and if there are any *`-`* s in the search string you'll want to pass in *`--`* to grep first; that can cause interesting side effects otherwise! – Pryftan Jun 02 '18 at 00:25
14

There is an ack tool that would do exactly what you are looking for.

http://linux.die.net/man/1/ack

ack -i search_string folder_path/*

You may omit -i for a case-sensitive search.

Daniel
Pal
  • 2
    What is this adding to the existing answers? This was suggested more than three years ago already. – fedorqui 'SO stop harming' Dec 02 '16 at 13:20
  • 1
    @fedorqui 1)no piping! 2)Use regular expressions 3)Get line numbers, file name with relative path, highlighted text etc. useful for editing after the search e.g "vim +lineno path/file.cpp" will get you right at the line no of interest. See the output of the command "ack include\|hpp" that searches "include" or "hpp" keywords under my search folder and subfolders. I hope the point is clear. Here is the sample output(Can't show the keyword highlights with simple text) process/child.hpp 11:boost/process/child.hpp process/all.hpp 21:#include – Pal Jul 11 '17 at 15:57
14

Use:

grep -c Your_Pattern *

This will report how many lines match your pattern in each of the files in the current directory.

Peter Mortensen
Dr_Hope
13

To search for the string and output just that line with the search string:

for i in $(find /path/of/target/directory -type f); do grep -i "the string to look for" "$i"; done

e.g.:

for i in $(find /usr/share/applications -type f); \
do grep -i "web browser" "$i"; done

To display filename containing the search string:

for i in $(find /path/of/target/directory -type f); do if grep -i "the string to look for" "$i" > /dev/null; then echo "$i"; fi; done;

e.g.:

for i in $(find /usr/share/applications -type f); \
do if grep -i "web browser" "$i" > /dev/null; then echo "$i"; \
fi; done;
  • 1
    I see only downside compared to using `find … -exec grep 'str' {} \;` (if you have to use `find` at all). – phk Oct 07 '16 at 16:14
  • 1
    This would break horribly if any of the files found by `find` contained spaces .. you could end up `grepping` the wrong files and/or missing the right files altogether. Just use `find ... -exec grep ...` if you have a need to use `find` .. but in this case a `grep -r ...` suffices. – shalomb Oct 11 '16 at 20:19
  • 1
    what is the point of using a loop over the results of find to then grep? This gets unnecessarily complicated. – fedorqui 'SO stop harming' Dec 02 '16 at 13:17
13

All previous answers suggest grep and find. But there is another way: Use Midnight Commander

It is a free utility (30 years old, proven by time) which is visual without being a GUI. It has tons of functions, and finding files is just one of them.

Peter Mortensen
12

You can use the command below if you want to search all the files but do not want the file names printed. Here I am capturing "TEXT" from all the log files, making sure that the file name is not printed:

grep -e TEXT *.log | cut -d' ' --complement -s -f1

grep with the -e option is quite quick compared to other options, as it is meant for pattern matching.

Mitul Patel
  • Personally I think you should remove the *`#`* because other than comments that typically implies something - and you shouldn't be root unless you absolutely have to be. Even so you needn't have the prompt surely? Call this petty but I have seen people many times over the years simply copy and paste and do things without truly understanding it. Not saying any will here but still.. Just a thought. – Pryftan Jun 02 '18 at 01:30
  • Better way use find + grep https://stackoverflow.com/a/51023211/7918560 – BreakBadSP Dec 19 '18 at 05:55
12

The below command will work fine for this approach:

find ./ -name "file_pattern_name"  -exec grep -r "pattern" {} \;
Peter Mortensen
Pradeep Goswami
11

Try this:

find . | xargs grep 'word' -sl
Peter Mortensen
Tayab Hussain
  • 4
    this is far slower than the grep solution – amine Dec 22 '14 at 16:58
  • @amine Yeah rather than using *`grep`* directly it pipes all the files *`find`* finds to xargs running *`grep`* on it. I'm sure you understand that but just to add to those who might not. The command here is .. I can't atm think of a good analogy but it's adding a lot of unnecessary and harmless overhead. – Pryftan Jun 02 '18 at 01:34
11

Avoid the hassle and install ack-grep. It eliminates a lot of permission and quotation issues.

apt-get install ack-grep

Then go to the directory you want to search and run the command below

cd /
ack-grep "find my keyword"
Kareem
10

Try this:

find / -type f -name "*" -exec grep -il "String_to_search" {} \;

Or

for i in /*;do grep -Ril "String_to_search" $i;done 2> /dev/null
Peter Mortensen
VIPIN KUMAR
  • what is this adding to the existing answers? – fedorqui 'SO stop harming' Dec 02 '16 at 13:19
  • Good Question - Let start with top answers to this question. I tried below commands on AIX server with more than 15k files in log dir. grep -rnw '/path/to/somewhere/' -e "pattern" >>> got the error "/usr/bin/grep: 0403-027 The parameter list is too long." grep -Ril "text-to-find-here" / >>> got the error "/usr/bin/grep: 0403-027 The parameter list is too long." ack 'text-to-find-here' >>> got the error "Segmentation fault(coredump)" – VIPIN KUMAR Dec 02 '16 at 17:07
  • find / -type f -name "*" -exec grep -il "String_to_search" {} \; >>> It will produce the result with filename and file data. find / -type f -exec grep -H 'text-to-find-here' {} \; >>> It will produce the result with filename only. for i in /*;do grep -Ril "String_to_search" $i;done 2> /dev/null >>> It will work like grep -Ril "text-to-find-here" / but support large number of file. – VIPIN KUMAR Dec 02 '16 at 17:07
  • @VIPINKUMAR **The parameter list is too long.** Yeah that's what *`xargs`* is for. Unsure on AIX if it has that though; no comment on your actual commands. – Pryftan Jun 02 '18 at 01:36
9

Use:

grep -Erni + "text you wanna search"

The command will search recursively in all files and directories of the current directory and print the result.

Note: if your grep output isn't colored, you can change that by adding alias grep='grep --color=always' to your shell source file.

Peter Mortensen
baldash
  • You might want to point out that *`-i`* makes the search case-insensitive; by default it doesn't have that - nor should it as Unix (etc.) isn't a case-insensitive OS. You might also want to specify what the other options are for too. – Pryftan Jun 02 '18 at 01:31
8

If you have a set of files that you will always be checking you can alias their paths, for example:

alias fd='find . -type f -regex ".*\.\(inc\|info\|module\|php\|test\|install\|uninstall\)"'

Then you can simply filter the list like this:

grep -U -l $'\015' $(fd)

This filters the list from fd down to the files that contain the CR pattern.

I find that aliasing the files that I am interested in helps me create easier scripts than always trying to remember how to get all those files. The recursive stuff works as well, but sooner or later you are going to have to contend with weeding out specific file types, which is why I just find all the file types I'm interested in to begin with.

dkinzer
8

You can use the following command to find particular text in a file:

cat file | grep 'abc' | cut -d':' -f2
Peter Mortensen
iamjayp
8

find with xargs is preferred when there are many potential matches to sift through. It runs more slowly than other options, but it always works. As some have discovered, xargs does not handle files with embedded spaces by default. You can overcome this by specifying the -d option.

Here is @RobEarl's answer, enhanced so it handles files with spaces:

find / -type f | xargs -d '\n' grep 'text-to-find-here'

Here is @venkat's answer, similarly enhanced:

find . -name "*.txt" | xargs -d '\n' grep -i "text_pattern"

Here is @Gert van Biljon's answer, similarly enhanced:

find . -type f -name "*.*" | xargs -d '\n' grep --with-filename --line-number --no-messages --color --ignore-case "searchtext"

Here is @LetalProgrammer's answer, similarly enhanced:

alias ffind="find / -type f | xargs -d '\n' grep"

Here is @Tayab Hussain's answer, similarly enhanced:

find . | xargs -d '\n' grep 'word' -sl
Mike Slinn
  • [So `grep -rl`](https://stackoverflow.com/questions/16956810/how-do-i-find-all-files-containing-specific-text-on-linux/45564790#45564790) doesn't work with many matches? – Peter Mortensen Apr 24 '19 at 16:19
  • "under many other Unix-like systems, arbitrarily long lists of parameters cannot be passed to a command, so the command may fail with an error message of "Argument list too long" (meaning that the exec system call's limit on the length of a command line was exceeded)" ... https://en.wikipedia.org/wiki/Xargs – Mike Slinn Apr 24 '19 at 20:03
7

Try this

find . -type f -name some_file_name.xml -exec grep -H PUT_YOUR_STRING_HERE {} \;
Sireesh Yarlagadda
  • 1
    This does not provide an answer to the question. To critique or request clarification from an author, leave a comment below their post. - [From Review](/review/low-quality-posts/10250356) – sergdenisov Nov 18 '15 at 20:14
  • 4
    @SergeyDenisov What gives? This is definitely an answer. (Whether it works or not is another matter.) – jpaugh Nov 18 '15 at 23:43
  • 1
    @jpaugh then you should explain it in details. – sergdenisov Nov 18 '15 at 23:55
  • 1
    @SergeyDenisov. It gives a suggested course of action that might produce the correct result. Or, even if it does not, it might help someone else. That's what I mean by, "It's an answer." If you want to know how it works, ask the poster. – jpaugh Nov 18 '15 at 23:58
  • @SireeshYarlagadda You should provide more information in your answer, especially since this command is relatively complicated. Break it down into parts and explain each one. (It was flagged as low quality because it lacked an explanation.) – jpaugh Nov 19 '15 at 00:00
  • 2
    @jpaugh I'm sure that one line command/code is not enough for a complete answer. You could write a comment giving a suggested course of action, but an answer should include an explanation. That's why this answer was flagged as "Low Quality Post" (not by me). – sergdenisov Nov 19 '15 at 09:47
  • Let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/95563/discussion-between-sergey-denisov-and-jpaugh). – sergdenisov Nov 19 '15 at 12:26
  • @SergeyDenisov I agree with you on that! But I did not understand what you meant from the "canned" comment in the review tools. – jpaugh Nov 19 '15 at 21:00
7

As Peter in the previous answer mentioned, all previous answers suggest grep and find.

But there is a more sophisticated way: using Gnome Commander, which has a full GUI and tons of options and has been around since 2001, and finding files is just one of them. It is a free utility as well, proven by time.

Peter Mortensen
Geeocode
7

See also The Platinum Searcher, which is similar to The Silver Searcher and is written in Go.

Example:

pt -e 'text to search'
Gustavo Paulo
  • 1
    A link to a solution is welcome, but please ensure your answer is useful without it: [add context around the link](//meta.stackexchange.com/a/8259) so your fellow users will have some idea what it is and why it’s there, then quote the most relevant part of the page you're linking to in case the target page is unavailable. [Answers that are little more than a link may be deleted.](//stackoverflow.com/help/deleted-answers) – Shree Apr 17 '18 at 03:26
6

GUI Search Alternative - For Desktop Use:

- As the question is not precisely asking for commands

Searchmonkey: An advanced file search tool that uses regular expressions without having to index your system. It is a graphical equivalent of find/grep. Available for Linux (Gnome/KDE/Java) and Windows (Java) - open source, GPL v3.

Features:

  • Advanced Regular Expressions
  • Results shown in-context
  • Search containing text
  • Panel to display line containing text
  • New 2018 updates
  • etc.


Peter Mortensen
intika
5

I'm trying to find a way to scan my entire Linux system for all files containing a specific string of text. ... Is this close to the proper way to do it? If not, how should I? ... This ability to find text strings in files would be extraordinarily useful for some programming projects I'm doing.

While you should never replace (or alias) a system command with a different program, due to risk of mysterious breakage of scripts or other utilities, if you are running a text search manually or from your own scripts or programs you should consider the fastest suitable program when searching a large number of files a number of times. Ten minutes to half an hour time spent installing and familiarizing yourself with a better utility can be recovered after a few uses for the use-case you described.

A webpage offering a "Feature comparison of ack, ag, git-grep, GNU grep and ripgrep" can assist you to decide which program offers the features you need.

  • Andrew Gallant's Blog claims: "ripgrep is faster than {grep, ag, git grep, ucg, pt, sift}" (a claim shared by some of the others, this is why a feature comparison is helpful). Of particular interest is his section on regex implementations and pitfalls.

    The following command searches all files, including hidden and executable:

    $ rg -uuu foobar

  • The Silver Searcher (ag) claims it is 5-10x faster than Ack. This program is suggested in some other answers. Its GitHub doesn't appear as recent as ripgrep's, and there are noticeably more commits and branches with fewer releases; it's hard to draw an absolute claim based on those stats. The short version: ripgrep is faster, but there's a tiny learning curve to not get caught by the differences.

  • So what could be next? You guessed it: The Platinum Searcher. The claims are: it searches code about 3–5× faster than ack, but its speed is equal to The Silver Searcher. It's written in GoLang and searches UTF-8, EUC-JP and Shift_JIS files, if that's of greater interest. Its GitHub is neither particularly recent nor active. GoLang itself has a fast and robust regex engine, but The Platinum Searcher would be easier to recommend if it had more user interest.

For a combination of speed and power, indexed search engines such as Elasticsearch or Solr can be a long-term investment that pays off, but not if you want a quick and simple replacement for grep. On the other hand, both have an API which can be called from any program you write, adding powerful searches to your program.

While it's possible to spawn an external program, execute a search, intercept its output and process it, calling an API is the way to go for power and performance.

This question was protected Aug 6 '15 at 19:34 with this caution:
  We're looking for long answers that provide some explanation and context. Don't just give a one-line answer; explain why your answer is right, ideally with citations.

While some answers suggest alternative ways to accomplish a search they don't explain why other than it's "free", "faster", "more sophisticated", "tons of features", etc. Don't try to sell it, just tell us "why your answer is right". I've attempted to teach how to choose what's best for the user, and why. This is why I offer yet another answer, when there are already so many. Otherwise I'd agree that there are already quite a few answers; I hope I've brought a lot new to the table.

Rob
2

Your command is correct. You just need to add -l to grep:

find / -type f -exec grep -l 'text-to-find-here' {} \;
Peter Mortensen
FariZ
1

I tried the grep command below. It helps me search the contents within my repository at /etc/yum.repos.d.

grep . -Ril -e 'texttoSearch' /etc/yum.repos.d

Peter Mortensen
muhammad tayyab
1

Try this command, which will give you the files containing the pattern you entered.

sudo grep -inr "your-pattern" /

Here: i - Ignore case distinctions, so that characters that differ only in case match each other.

n - Prefix each line of output with the 1-based line number within its input file.

r - Read all files under each directory, recursively, following symbolic links only if they are on the command line. Note that if no file operand is given, grep searches the working directory.

1

You can use ripgrep, which by default will respect the project's .gitignore file.

ripgrep

To suppress Permission denied errors

rg -i rustacean 2> /dev/null

which will redirect the stderr (standard error output) to /dev/null

Levon
1

My use case was to find Python code I had written way back that wrote jsonlines a particular way. I knew that jsonl would be part of the function name and to_json would appear in the body, but not much else.

Despite 50 answers, finding more than one string in the same file (whether or not in the same line) hasn't been answered. Hopefully someone else in the same situation finds this answer and can reuse this snippet.

The -q in grep is for quiet. Nothing is printed, only the return value is set. Thus the -print at the end. Each -exec only runs if the previous one succeeded. So if you have many files it pays to think about patterns that will eliminate files you aren't interested in.

find . -type f -name "*.py" \
  -exec grep -q -e 'to_json' {} \; \
  -exec grep -q -e 'def\s.*jsonl' {} \; \
  -print
Leo
0

You can also use awk:

awk '/^(pattern)/{print}' /path/to/find/*

pattern is the string you want to match in the files.

Peter Mortensen
  • 1
    Does this traverse a directory tree, or only the files in the path specified without going into contained directories? – RufusVS Nov 14 '19 at 17:44
  • 1
    Doesn't work for me: `user@host:/dir$ awk '/^(getCookie)/{print}' . awk: warning: command line argument `.' is a directory: skipped` – CoderGuy123 Feb 20 '20 at 17:00
0
grep -lrnw '/root/Desktop/ipozal' -e 'geolocation'
Eyni Kave
0

Customize the command below according to your needs to find any string recursively in files.

grep -i hack $(find /etc/ -type f)
linux.cnf
  • Customise what parts? Your answer needs more explanation (see @rakib_'s answer for an example) – Ashley Mills Sep 14 '20 at 10:31
  • customise means choose "hack"(searching string) keyword and /etc location according to your demand. you can also use below command grep -ErIRi hack /* – linux.cnf Sep 27 '20 at 17:07
0

If you are in a git repository you can use:

git grep something
Dorian
-2

Find any files whose name is ".kube/config" and whose content includes eks_use1d:

locate ".kube/config" | xargs -i sh -c 'echo \\n{};cat {} | grep eks_use1d'
BuZZ-dEE
John Zheng