Reading a plain text file in Java
Asked Answered
P

31

1062

There seem to be different ways to read and write file data in Java.

I want to read ASCII data from a file. What are the possible ways and their differences?

Pooh answered 17/1, 2011 at 18:29 Comment(4)
I also disagree with closing as "not constructive". Fortunately, this could well be closed as duplicate. Good answers e.g. in How to create a String from the contents of a file?, What is simplest way to read a file into String?, What are the simplest classes for reading files?Strathspey
Without loops: Scanner sc = new Scanner(file, "UTF-8"); sc.useDelimiter("$^"); /* regex matching nothing */ String text = sc.next(); sc.close();Gish
It's interesting that there is nothing like Python's "read()" in Java to read the whole file into a string.Unlimited
This is the simplest way to do this: mkyong.com/java/…Octopod
P
637

ASCII is a text format, so you would use Readers for reading it. Java also supports reading from a binary file using InputStreams. If the files being read are huge, you would want to use a BufferedReader on top of a FileReader to improve read performance.

Go through this article on how to use a Reader

I'd also recommend you download and read this wonderful (yet free) book called Thinking In Java

In Java 7:

new String(Files.readAllBytes(...))

(docs) or

Files.readAllLines(...)

(docs)

In Java 8:

Files.lines(...).forEach(...)

(docs)
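
For context, here is a minimal sketch putting those calls together (the example.txt file name and the UTF-8 charset are assumptions for illustration):

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Stream;

public class NioReadSketch {
    public static void main(String[] args) throws IOException {
        Path path = Paths.get("example.txt"); // hypothetical file name

        // Java 7: the whole file as a single String
        String content = new String(Files.readAllBytes(path), StandardCharsets.UTF_8);

        // Java 7: the whole file as a list of lines
        List<String> lines = Files.readAllLines(path, StandardCharsets.UTF_8);

        // Java 8: stream the lines lazily; close the stream when done
        try (Stream<String> stream = Files.lines(path)) {
            stream.forEach(System.out::println);
        }
    }
}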

Pejoration answered 17/1, 2011 at 18:31 Comment(9)
Picking a Reader really depends on what you need the content of the file for. If the file is small(ish) and you need it all, it's faster (benchmarked by us: 1.8-2x) to just use a FileReader and read everything (or at least large enough chunks). If you're processing it line by line then go for the BufferedReader.Lorelle
I guess @Lorelle is referring to the method of reading entire file into memory explained here java2s.com/Tutorial/Java/0180__File/…Headwaiter
Will the line order be preserved when using "Files.lines(..).forEach(...)"? My understanding is that the order will be arbitrary after this operation.Nadeen
Files.lines(…).forEach(…) does not preserve order of lines but is executed in parallel, @Dash. If the order is important, you can use Files.lines(…).forEachOrdered(…), which should preserve the order (did not verify though).Seemly
@Seemly this is interesting, but can you quote from the docs where it says that Files.lines(...).forEach(...) is executed in parallel? I thought this was only the case when you explicitly make the stream parallel using Files.lines(...).parallel().forEach(...).Statistical
My original formulation is not bulletproof, @KlitosKyriacou. The point is that forEach does not guarantee any order and the reason is easy parallelization. If order is to be preserved, use forEachOrdered.Seemly
@AravindR.Yarram by the way the link to the book "Thinking In Java" is no longer working!Hurley
Just to clarify the answer about the parallelization and process order. The Java doc says "The behavior of this operation is explicitly nondeterministic. For parallel stream pipelines, this operation does not guarantee to respect the encounter order of the stream, as doing so would sacrifice the benefit of parallelism. For any given element, the action may be performed at whatever time and in whatever thread the library chooses. If the action accesses shared state, it is responsible for providing the required synchronization."Salina
So my understanding is that the order is always the same unless parallelism is used.Salina
F
751

My favorite way to read a small file is to use a BufferedReader and a StringBuilder. It is very simple and to the point (though not particularly efficient, but good enough for most cases):

BufferedReader br = new BufferedReader(new FileReader("file.txt"));
try {
    StringBuilder sb = new StringBuilder();
    String line = br.readLine();

    while (line != null) {
        sb.append(line);
        sb.append(System.lineSeparator());
        line = br.readLine();
    }
    String everything = sb.toString();
} finally {
    br.close();
}

Some have pointed out that since Java 7 you should use the try-with-resources (i.e., auto-close) feature:

try(BufferedReader br = new BufferedReader(new FileReader("file.txt"))) {
    StringBuilder sb = new StringBuilder();
    String line = br.readLine();

    while (line != null) {
        sb.append(line);
        sb.append(System.lineSeparator());
        line = br.readLine();
    }
    String everything = sb.toString();
}

When I read strings like this, I usually want to do some string handling per line anyway, so then I go for this implementation.

Though if I want to just read a file into a String, I always use Apache Commons IO and its IOUtils.toString() method. You can have a look at the source here:

http://www.docjar.com/html/api/org/apache/commons/io/IOUtils.java.html

FileInputStream inputStream = new FileInputStream("foo.txt");
try {
    String everything = IOUtils.toString(inputStream);
} finally {
    inputStream.close();
}

And even simpler with Java 7:

try(FileInputStream inputStream = new FileInputStream("foo.txt")) {     
    String everything = IOUtils.toString(inputStream);
    // do something with everything string
}
Facile answered 17/1, 2011 at 18:42 Comment(13)
I've made a small adjustment to stop adding a newline (\n) if the last line is reached: while (line != null) { sb.append(line); line = br.readLine(); /* Only add a newline when the current line is NOT the last line */ if (line != null) { sb.append("\n"); } }Forsaken
I feel it is better to explicitly give the encoding and use a FileInputStream rather than a FileReader directly. See this question too: #697126 `reader = new InputStreamReader(new FileInputStream("<filePath>"), "UTF-8");`Humpage
Similar to Apache Common IO IOUtils#toString() is sun.misc.IOUtils#readFully(), which is included in the Sun/Oracle JREs.Idioglossia
For performance always call sb.append('\n') in preference to sb.append("\n") as a char is appended to the StringBuilder faster than a StringIdioglossia
@Idioglossia I added your change to my example.Facile
FileReader may throw FileNotFoundException and BufferedRead may throw IOException so you have to catch them.Heid
Same consideration should be applied within finally too.Heid
I haven't added exception handling intentionally, as you as the developer should always do the error handling that is proper for your application. Your compiler will assist those who do not see this themselves.Facile
there is no need to use readers directly and also no need for ioutils. java7 has built in methods to read an entire file/all lines: See docs.oracle.com/javase/7/docs/api/java/nio/file/… and docs.oracle.com/javase/7/docs/api/java/nio/file/…Should
Shouldn't we close the file input stream in the last one you showed? I think it's better to close the file input stream since there might be beginners here too.Spandrel
why do you need sb.toString()? what is the type of sb which needs to be stringified?Gooch
please remove this while (line != null)Attractive
IOUtils.toString is deprecatedPirozzo
A
152

The easiest way is to use the Scanner class in Java and the FileReader object. Simple example:

Scanner in = new Scanner(new FileReader("filename.txt"));

Scanner has several methods for reading in strings, numbers, etc... You can look for more information on this on the Java documentation page.

For example reading the whole content into a String:

StringBuilder sb = new StringBuilder();
while(in.hasNext()) {
    sb.append(in.next());
}
in.close();
String outString = sb.toString();

Also if you need a specific encoding you can use this instead of FileReader:

new InputStreamReader(new FileInputStream(fileUtf8), StandardCharsets.UTF_8)
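
Putting the pieces together, a minimal sketch (fileUtf8.txt is a hypothetical file name, assumed to be UTF-8 encoded):

import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

public class ScannerWithEncoding {
    public static void main(String[] args) throws FileNotFoundException {
        Scanner in = new Scanner(new InputStreamReader(
                new FileInputStream("fileUtf8.txt"), StandardCharsets.UTF_8));
        StringBuilder sb = new StringBuilder();
        while (in.hasNextLine()) {
            // Keep the line breaks, which in.next() would otherwise drop
            sb.append(in.nextLine()).append(System.lineSeparator());
        }
        in.close();
        String outString = sb.toString();
        System.out.print(outString);
    }
}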
Airlia answered 17/1, 2011 at 18:35 Comment(4)
while (in.hasNext()) { System.out.println (in.next()); }Tinner
@Hissain But much easier to use than BufferedReaderAirlia
You must surround it with try-catch.Davy
@JesusRamos Not really, why do you think so? What's easier about this than while ((line = br.readLine()) != null) { sb.append(line); }?Perlie
T
128

Here is a simple solution:

String content = new String(Files.readAllBytes(Paths.get("sample.txt")));

Or to read as list:

List<String> content = Files.readAllLines(Paths.get("sample.txt"));
Timisoara answered 29/1, 2015 at 16:24 Comment(2)
readAllLines requires Android O (>= 8.0).Kitkitchen
Related answer (same solution): https://mcmap.net/q/45897/-read-complete-file-without-using-loop-in-javaNetherlands
H
57

Here's another way to do it without using external libraries:

import java.io.File;
import java.io.FileReader;
import java.io.IOException;

public String readFile(String filename)
{
    String content = null;
    File file = new File(filename); // For example, foo.txt
    FileReader reader = null;
    try {
        reader = new FileReader(file);
        char[] chars = new char[(int) file.length()];
        reader.read(chars);
        content = new String(chars);
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        if (reader != null) {
            try {
                reader.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
    return content;
}
Hooks answered 22/5, 2012 at 21:2 Comment(4)
or use "try-with-resources" try(FileReader reader = new FileReader(file))Quagga
I noticed the file.length(), How well does this work with utf-16 files?Lamarckism
This technique assumes that read() fills the buffer; that the number of chars equals the number of bytes; that the number of bytes fits into memory; and that the number of bytes fits into an integer. -1Perlie
@HermesTrismegistus I provided four reasons why it is wrong. StefanReich is perfectly correct to agree with me.Perlie
C
44

I had to benchmark the different ways. I shall comment on my findings but, in short, the fastest way is to use a plain old BufferedInputStream over a FileInputStream. If many files must be read then three threads will reduce the total execution time to roughly half, but adding more threads will progressively degrade performance until making it take three times longer to complete with twenty threads than with just one thread.

The assumption is that you must read a file and do something meaningful with its contents. In the examples here, that means reading lines from a log and counting the ones that contain values exceeding a certain threshold. So I am assuming that the one-liner Java 8 Files.lines(Paths.get("/path/to/file.txt")).map(line -> line.split(";")) is not an option.

I tested on Java 1.8, Windows 7 and both SSD and HDD drives.

I wrote six different implementations:

rawParse: Use BufferedInputStream over a FileInputStream and then cut lines reading byte by byte. This outperformed any other single-thread approach, but it may be very inconvenient for non-ASCII files.

lineReaderParse: Use a BufferedReader over a FileReader, read line by line, split lines by calling String.split(). This is approximately 20% slower than rawParse.

lineReaderParseParallel: This is the same as lineReaderParse, but it uses several threads. This is the fastest option overall in all cases.

nioFilesParse: Use java.nio.files.Files.lines()

nioAsyncParse: Use an AsynchronousFileChannel with a completion handler and a thread pool.

nioMemoryMappedParse: Use a memory-mapped file. This is really a bad idea yielding execution times at least three times longer than any other implementation.

These are the average times for reading 204 files of 4 MB each on a quad-core i7 with an SSD drive. The files are generated on the fly to avoid disk caching.

rawParse                11.10 sec
lineReaderParse         13.86 sec
lineReaderParseParallel  6.00 sec
nioFilesParse           13.52 sec
nioAsyncParse           16.06 sec
nioMemoryMappedParse    37.68 sec

I found a smaller difference than I expected between running on an SSD and on an HDD, with the SSD approximately 15% faster. This may be because the files are generated on an unfragmented HDD and read sequentially, so the spinning drive can perform nearly as well as an SSD.

I was surprised by the low performance of the nioAsyncParse implementation. Either I have implemented something the wrong way, or the multi-threaded implementation using NIO and a completion handler performs the same as (or even worse than) a single-threaded implementation with the java.io API. Moreover, the asynchronous parse with a CompletionHandler is much longer in lines of code and trickier to implement correctly than a straightforward implementation on plain streams.

Here are the six implementations, followed by a class containing them all plus a parameterizable main() method that lets you play with the number of files, the file size, and the degree of concurrency. Note that the size of the files varies by plus or minus 20%. This is to avoid any effect due to all the files being exactly the same size.

rawParse

public void rawParse(final String targetDir, final int numberOfFiles) throws IOException, ParseException {
    overrunCount = 0;
    final int dl = (int) ';';
    StringBuffer lineBuffer = new StringBuffer(1024);
    for (int f=0; f<numberOfFiles; f++) {
        File fl = new File(targetDir+filenamePreffix+String.valueOf(f)+".txt");
        FileInputStream fin = new FileInputStream(fl);
        BufferedInputStream bin = new BufferedInputStream(fin);
        int character;
        while((character=bin.read())!=-1) {
            if (character==dl) {

                // Here is where something is done with each line
                doSomethingWithRawLine(lineBuffer.toString());
                lineBuffer.setLength(0);
            }
            else {
                lineBuffer.append((char) character);
            }
        }
        bin.close();
        fin.close();
    }
}

public final void doSomethingWithRawLine(String line) throws ParseException {
    // What to do for each line
    int fieldNumber = 0;
    final int len = line.length();
    StringBuffer fieldBuffer = new StringBuffer(256);
    for (int charPos=0; charPos<len; charPos++) {
        char c = line.charAt(charPos);
        if (c==DL0) {
            String fieldValue = fieldBuffer.toString();
            if (fieldValue.length()>0) {
                switch (fieldNumber) {
                    case 0:
                        Date dt = fmt.parse(fieldValue);
                        fieldNumber++;
                        break;
                    case 1:
                        double d = Double.parseDouble(fieldValue);
                        fieldNumber++;
                        break;
                    case 2:
                        int t = Integer.parseInt(fieldValue);
                        fieldNumber++;
                        break;
                    case 3:
                        if (fieldValue.equals("overrun"))
                            overrunCount++;
                        break;
                }
            }
            fieldBuffer.setLength(0);
        }
        else {
            fieldBuffer.append(c);
        }
    }
}

lineReaderParse

public void lineReaderParse(final String targetDir, final int numberOfFiles) throws IOException, ParseException {
    String line;
    for (int f=0; f<numberOfFiles; f++) {
        File fl = new File(targetDir+filenamePreffix+String.valueOf(f)+".txt");
        FileReader frd = new FileReader(fl);
        BufferedReader brd = new BufferedReader(frd);

        while ((line=brd.readLine())!=null)
            doSomethingWithLine(line);
        brd.close();
        frd.close();
    }
}

public final void doSomethingWithLine(String line) throws ParseException {
    // Example of what to do for each line
    String[] fields = line.split(";");
    Date dt = fmt.parse(fields[0]);
    double d = Double.parseDouble(fields[1]);
    int t = Integer.parseInt(fields[2]);
    if (fields[3].equals("overrun"))
        overrunCount++;
}

lineReaderParseParallel

public void lineReaderParseParallel(final String targetDir, final int numberOfFiles, final int degreeOfParalelism) throws IOException, ParseException, InterruptedException {
    Thread[] pool = new Thread[degreeOfParalelism];
    int batchSize = numberOfFiles / degreeOfParalelism;
    for (int b=0; b<degreeOfParalelism; b++) {
        pool[b] = new LineReaderParseThread(targetDir, b*batchSize, b*batchSize+batchSize);
        pool[b].start();
    }
    for (int b=0; b<degreeOfParalelism; b++)
        pool[b].join();
}

class LineReaderParseThread extends Thread {

    private String targetDir;
    private int fileFrom;
    private int fileTo;
    private DateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
    private int overrunCounter = 0;

    public LineReaderParseThread(String targetDir, int fileFrom, int fileTo) {
        this.targetDir = targetDir;
        this.fileFrom = fileFrom;
        this.fileTo = fileTo;
    }

    private void doSomethingWithTheLine(String line) throws ParseException {
        String[] fields = line.split(DL);
        Date dt = fmt.parse(fields[0]);
        double d = Double.parseDouble(fields[1]);
        int t = Integer.parseInt(fields[2]);
        if (fields[3].equals("overrun"))
            overrunCounter++;
    }

    @Override
    public void run() {
        String line;
        for (int f=fileFrom; f<fileTo; f++) {
            File fl = new File(targetDir+filenamePreffix+String.valueOf(f)+".txt");
            try {
            FileReader frd = new FileReader(fl);
            BufferedReader brd = new BufferedReader(frd);
            while ((line=brd.readLine())!=null) {
                doSomethingWithTheLine(line);
            }
            brd.close();
            frd.close();
            } catch (IOException | ParseException ioe) { }
        }
    }
}

nioFilesParse

public void nioFilesParse(final String targetDir, final int numberOfFiles) throws IOException, ParseException {
    for (int f=0; f<numberOfFiles; f++) {
        Path ph = Paths.get(targetDir+filenamePreffix+String.valueOf(f)+".txt");
        Consumer<String> action = new LineConsumer();
        Stream<String> lines = Files.lines(ph);
        lines.forEach(action);
        lines.close();
    }
}


class LineConsumer implements Consumer<String> {

    @Override
    public void accept(String line) {

        // What to do for each line
        String[] fields = line.split(DL);
        if (fields.length>1) {
            try {
                Date dt = fmt.parse(fields[0]);
            }
            catch (ParseException e) {
            }
            double d = Double.parseDouble(fields[1]);
            int t = Integer.parseInt(fields[2]);
            if (fields[3].equals("overrun"))
                overrunCount++;
        }
    }
}

nioAsyncParse

public void nioAsyncParse(final String targetDir, final int numberOfFiles, final int numberOfThreads, final int bufferSize) throws IOException, ParseException, InterruptedException {
    ScheduledThreadPoolExecutor pool = new ScheduledThreadPoolExecutor(numberOfThreads);
    ConcurrentLinkedQueue<ByteBuffer> byteBuffers = new ConcurrentLinkedQueue<ByteBuffer>();

    for (int b=0; b<numberOfThreads; b++)
        byteBuffers.add(ByteBuffer.allocate(bufferSize));

    for (int f=0; f<numberOfFiles; f++) {
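        // Note: consumerThreads is a field of the enclosing benchmark class, presumably a
        // java.util.concurrent.Semaphore limiting in-flight files (see the full implementation linked below)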
        consumerThreads.acquire();
        String fileName = targetDir+filenamePreffix+String.valueOf(f)+".txt";
        AsynchronousFileChannel channel = AsynchronousFileChannel.open(Paths.get(fileName), EnumSet.of(StandardOpenOption.READ), pool);
        BufferConsumer consumer = new BufferConsumer(byteBuffers, fileName, bufferSize);
        channel.read(consumer.buffer(), 0l, channel, consumer);
    }
    consumerThreads.acquire(numberOfThreads);
}


class BufferConsumer implements CompletionHandler<Integer, AsynchronousFileChannel> {

        private ConcurrentLinkedQueue<ByteBuffer> buffers;
        private ByteBuffer bytes;
        private String file;
        private StringBuffer chars;
        private int limit;
        private long position;
        private DateFormat frmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");

        public BufferConsumer(ConcurrentLinkedQueue<ByteBuffer> byteBuffers, String fileName, int bufferSize) {
            buffers = byteBuffers;
            bytes = buffers.poll();
            if (bytes==null)
                bytes = ByteBuffer.allocate(bufferSize);

            file = fileName;
            chars = new StringBuffer(bufferSize);
            frmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
            limit = bufferSize;
            position = 0l;
        }

        public ByteBuffer buffer() {
            return bytes;
        }

        @Override
        public synchronized void completed(Integer result, AsynchronousFileChannel channel) {

            if (result!=-1) {
                bytes.flip();
                final int len = bytes.limit();
                int i = 0;
                try {
                    for (i = 0; i < len; i++) {
                        byte by = bytes.get();
                        if (by=='\n') {
                            // ***
                            // The code used to process the line goes here
                            chars.setLength(0);
                        }
                        else {
                                chars.append((char) by);
                        }
                    }
                }
                catch (Exception x) {
                    System.out.println(
                        "Caught exception " + x.getClass().getName() + " " + x.getMessage() +
                        " i=" + String.valueOf(i) + ", limit=" + String.valueOf(len) +
                        ", position="+String.valueOf(position));
                }

                if (len==limit) {
                    bytes.clear();
                    position += len;
                    channel.read(bytes, position, channel, this);
                }
                else {
                    try {
                        channel.close();
                    }
                    catch (IOException e) {
                    }
                    consumerThreads.release();
                    bytes.clear();
                    buffers.add(bytes);
                }
            }
            else {
                try {
                    channel.close();
                }
                catch (IOException e) {
                }
                consumerThreads.release();
                bytes.clear();
                buffers.add(bytes);
            }
        }

        @Override
        public void failed(Throwable e, AsynchronousFileChannel channel) {
        }
};

FULL RUNNABLE IMPLEMENTATION OF ALL CASES

https://github.com/sergiomt/javaiobenchmark/blob/master/FileReadBenchmark.java

Comte answered 14/11, 2016 at 20:20 Comment(0)
W
29

Here are four working and tested methods:

Using BufferedReader

package io;
import java.io.*;
public class ReadFromFile2 {
    public static void main(String[] args)throws Exception {
        File file = new File("C:\\Users\\pankaj\\Desktop\\test.java");
        BufferedReader br = new BufferedReader(new FileReader(file));
        String st;
        while((st=br.readLine()) != null){
            System.out.println(st);
        }
    }
}

Using Scanner

package io;

import java.io.File;
import java.util.Scanner;

public class ReadFromFileUsingScanner {
    public static void main(String[] args) throws Exception {
        File file = new File("C:\\Users\\pankaj\\Desktop\\test.java");
        Scanner sc = new Scanner(file);
        while(sc.hasNextLine()){
            System.out.println(sc.nextLine());
        }
    }
}

Using FileReader

package io;
import java.io.*;
public class ReadingFromFile {

    public static void main(String[] args) throws Exception {
        FileReader fr = new FileReader("C:\\Users\\pankaj\\Desktop\\test.java");
        int i;
        while ((i=fr.read()) != -1){
            System.out.print((char) i);
        }
    }
}

Read the entire file without a loop using the Scanner class

package io;

import java.io.File;
import java.io.FileNotFoundException;
import java.util.Scanner;

public class ReadingEntireFileWithoutLoop {

    public static void main(String[] args) throws FileNotFoundException {
        File file = new File("C:\\Users\\pankaj\\Desktop\\test.java");
        Scanner sc = new Scanner(file);
        sc.useDelimiter("\\Z");
        System.out.println(sc.next());
    }
}
Watford answered 10/1, 2017 at 18:52 Comment(2)
How to give path if the folders are present inside the project?Inca
What about java.nio.file.Files? We can now just use readAllLines, readAllBytes, and lines.Bill
L
20

The methods within org.apache.commons.io.FileUtils may also be very handy, e.g.:

/**
 * Reads the contents of a file line by line to a List
 * of Strings using the default encoding for the VM.
 */
static List readLines(File file)
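
For illustration, a minimal usage sketch (assuming Commons IO is on the classpath; input.txt and the UTF-8 charset are hypothetical choices):

import java.io.File;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.List;

import org.apache.commons.io.FileUtils;

public class FileUtilsReadLinesSketch {
    public static void main(String[] args) throws IOException {
        // Read every line of the file with an explicit charset (the overload
        // without a charset is deprecated in recent Commons IO versions)
        List<String> lines = FileUtils.readLines(new File("input.txt"), StandardCharsets.UTF_8);
        for (String line : lines) {
            System.out.println(line);
        }
    }
}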
Letter answered 17/1, 2011 at 18:46 Comment(3)
Or if you prefer Guava (a more modern, actively maintained library), it has similar utilities in its Files class. Simple examples in this answer.Strathspey
or you simply use the built in method to get all lines: docs.oracle.com/javase/7/docs/api/java/nio/file/…Should
Link on apache commons seems dead.Secondclass
C
18

I documented 15 ways to read a file in Java and then tested them for speed with various file sizes - from 1 KB to 1 GB - and here are the top three ways to do this:

  1. java.nio.file.Files.readAllBytes()

    Tested to work in Java 7, 8, and 9.

    import java.io.File;
    import java.io.IOException;
    import java.nio.file.Files;
    
    public class ReadFile_Files_ReadAllBytes {
      public static void main(String [] pArgs) throws IOException {
        String fileName = "c:\\temp\\sample-10KB.txt";
        File file = new File(fileName);
    
        byte [] fileBytes = Files.readAllBytes(file.toPath());
        char singleChar;
        for(byte b : fileBytes) {
          singleChar = (char) b;
          System.out.print(singleChar);
        }
      }
    }
    
  2. java.io.BufferedReader.readLine()

    Tested to work in Java 7, 8, 9.

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;
    
    public class ReadFile_BufferedReader_ReadLine {
      public static void main(String [] args) throws IOException {
        String fileName = "c:\\temp\\sample-10KB.txt";
        FileReader fileReader = new FileReader(fileName);
    
        try (BufferedReader bufferedReader = new BufferedReader(fileReader)) {
          String line;
          while((line = bufferedReader.readLine()) != null) {
            System.out.println(line);
          }
        }
      }
    }
    
  3. java.nio.file.Files.lines()

    This was tested to work in Java 8 and 9 but won't work in Java 7 because of the lambda expression requirement.

    import java.io.File;
    import java.io.IOException;
    import java.nio.file.Files;
    import java.util.stream.Stream;
    
    public class ReadFile_Files_Lines {
      public static void main(String[] pArgs) throws IOException {
        String fileName = "c:\\temp\\sample-10KB.txt";
        File file = new File(fileName);
    
    try (Stream<String> linesStream = Files.lines(file.toPath())) {
          linesStream.forEach(line -> {
            System.out.println(line);
          });
        }
      }
    }
    
Cassel answered 7/4, 2018 at 16:41 Comment(0)
W
17

What do you want to do with the text? Is the file small enough to fit into memory? I would try to find the simplest way to handle the file for your needs. The FileUtils library is very handy for this.

for(String line: FileUtils.readLines("my-text-file"))
    System.out.println(line);
Willawillabella answered 17/1, 2011 at 22:33 Comment(4)
it's also built into java7: docs.oracle.com/javase/7/docs/api/java/nio/file/…Should
@PeterLawrey probably means org.apache.commons.io.FileUtils. Google link may change content over time, as the most widespread meaning shifts, but this matches his query and looks correct.Seemly
Unfortunately, nowadays there is no readLines(String) and readLines(File) is deprecated in favor of readLines(File, Charset). The encoding can be supplied also as a string.Seemly
A marginally older answer suggesting the same methodSeemly
A
10

The most intuitive method is introduced in Java 11 Files.readString

import java.io.*;
import java.nio.file.Files;
import java.nio.file.Paths;

public class App {
    public static void main(String args[]) throws IOException {
        String content = Files.readString(Paths.get("D:\\sandbox\\mvn\\my-app\\my-app.iml"));
        System.out.print(content);
    }
}

PHP has had this luxury for decades! ☺

Apulia answered 12/3, 2020 at 9:32 Comment(0)
L
9

Below is a one-liner for doing it the Java 8 way, assuming the text.txt file is in the root of the Eclipse project directory.

List<String> lines = Files.lines(Paths.get("text.txt")).collect(Collectors.toList());
Lamphere answered 15/11, 2016 at 17:7 Comment(0)
M
8

The buffered stream classes are much more performant in practice, so much so that the NIO.2 API includes methods that specifically return these stream classes, in part to encourage you always to use buffered streams in your application.

Here is an example:

Path path = Paths.get("/myfolder/myfile.ext");
try (BufferedReader reader = Files.newBufferedReader(path)) {
    // Read from the stream
    String currentLine = null;
    while ((currentLine = reader.readLine()) != null) {
        // Do your code here
    }
} catch (IOException e) {
    // Handle file I/O exception...
}

You can replace this code

BufferedReader reader = Files.newBufferedReader(path);

with

BufferedReader br = new BufferedReader(new FileReader("/myfolder/myfile.ext"));

I recommend this article to learn the main uses of Java NIO and IO.

Muddler answered 26/9, 2018 at 14:41 Comment(0)
G
7

Using BufferedReader:

import java.io.BufferedReader;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;

BufferedReader br;
try {
    br = new BufferedReader(new FileReader("/fileToRead.txt"));
    try {
        String x;
        while ( (x = br.readLine()) != null ) {
            // Printing out each line in the file
            System.out.println(x);
        }
    }
    catch (IOException e) {
        e.printStackTrace();
    }
}
catch (FileNotFoundException e) {
    System.out.println(e);
    e.printStackTrace();
}
Groomsman answered 26/12, 2015 at 20:17 Comment(0)
B
7

This is basically the same as Jesus Ramos' answer, except with File instead of FileReader, plus iteration to step through the contents of the file.

Scanner in = new Scanner(new File("filename.txt"));

while (in.hasNext()) { // Iterates each line in the file
    String line = in.nextLine();
    // Do something with line
}

in.close(); // Don't forget to close the resource to prevent leaks

... throws FileNotFoundException

Biting answered 19/7, 2016 at 5:13 Comment(1)
File vs FileReader: With a FileReader, the file must exist and operating system permissions must permit access. With a File, it is possible to test those permissions or check if the file is a directory. File has useful functions: isFile(), isDirectory(), listFiles(), canExecute(), canRead(), canWrite(), exists(), mkdir(), delete(). File.createTempFile() writes to the system default temp directory. This method will return a file object that can be used to open FileOutputStream objects, etc. sourceBiting
P
6

Probably not as fast as with buffered I/O, but quite terse:

    String content;
    try (Scanner scanner = new Scanner(textFile).useDelimiter("\\Z")) {
        content = scanner.next();
    }

The \Z pattern tells the Scanner that the delimiter is EOF.

Pigg answered 31/12, 2014 at 13:0 Comment(3)
A very related, already existing answer is by Jesus Ramos.Seemly
True, should be: if(scanner.hasNext()) content = scanner.next();Pigg
This fails for me on Android 4.4. Only 1024 bytes are read. YMMV.Julian
C
3

The simplest way to read data from a file in Java is to use the File class to reference the file and the Scanner class to read its contents.

public static void main(String args[])throws Exception
{
   File f = new File("input.txt");
   takeInputIn2DArray(f);
}

public static void takeInputIn2DArray(File f) throws Exception
{
    Scanner s = new Scanner(f);
    int a[][] = new int[20][20];
    for(int i=0; i<20; i++)
    {
        for(int j=0; j<20; j++)
        {
            a[i][j] = s.nextInt();
        }
    }
}

PS: Don't forget to import java.util.*; for Scanner to work.

Christenechristening answered 2/2, 2015 at 14:48 Comment(0)
B
3

You can use readAllLines and the join method to get the whole file content in one line:

String str = String.join("\n",Files.readAllLines(Paths.get("e:\\text.txt")));

It uses UTF-8 encoding by default, which reads ASCII data correctly.

Also you can use readAllBytes:

String str = new String(Files.readAllBytes(Paths.get("e:\\text.txt")), StandardCharsets.UTF_8);

I think readAllBytes is faster and more precise, because it does not replace each newline with \n (and the newline may be \r\n). Which one is suitable depends on your needs.

Brainbrainard answered 4/2, 2018 at 8:30 Comment(0)
D
2

I don't see it mentioned yet in the other answers so far, but if "best" means speed, then the new Java I/O (NIO) might provide the fastest performance. However, it is not always the easiest for someone learning to figure out.

http://download.oracle.com/javase/tutorial/essential/io/file.html
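
Since the link alone does not show the technique, here is a minimal NIO.2 sketch (the file.txt name and the UTF-8 charset are assumptions):

import java.io.BufferedReader;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class NioSketch {
    public static void main(String[] args) throws IOException {
        Path path = Paths.get("file.txt"); // hypothetical file name

        // Read the whole file at once
        String content = new String(Files.readAllBytes(path), StandardCharsets.UTF_8);
        System.out.println(content);

        // Or read it line by line through a buffered reader
        try (BufferedReader reader = Files.newBufferedReader(path, StandardCharsets.UTF_8)) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}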

Dovap answered 17/1, 2011 at 19:45 Comment(1)
You should have stated how it's done instead of just giving a link to follow.Roche
T
2

Guava provides a one-liner for this:

import com.google.common.base.Charsets;
import com.google.common.io.Files;

String contents = Files.toString(filePath, Charsets.UTF_8);
Topo answered 12/10, 2016 at 7:6 Comment(0)
C
2

This might not be the exact answer to the question. It's just another way of reading a file where you do not explicitly specify the path to your file in your Java code; instead, you feed the file in from the command line.

With the following code,

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.IOException;

public class InputReader{

    public static void main(String[] args)throws IOException{
        BufferedReader br = new BufferedReader(new InputStreamReader(System.in));
        String s="";
        while((s=br.readLine())!=null){
            System.out.println(s);
        }
    }
}

just go ahead and run it with:

java InputReader < input.txt

This would read the contents of input.txt and print it to your console.

You can also make your System.out.println() write to a specific file through the command line as follows:

java InputReader < input.txt > output.txt

This would read from input.txt and write to output.txt.

Caia answered 16/5, 2017 at 13:29 Comment(0)
A
2

Cactoos gives you a declarative one-liner:

new TextOf(new File("a.txt")).asString();
Allomerism answered 27/8, 2017 at 12:54 Comment(0)
C
1

For JSF-based Maven web applications, just use ClassLoader and the Resources folder to read in any file you want:

  1. Put any file you want to read in the Resources folder.
  2. Put the Apache Commons IO dependency into your POM:

    <dependency>
        <groupId>org.apache.commons</groupId>
        <artifactId>commons-io</artifactId>
        <version>1.3.2</version>
    </dependency>
    
  3. Use the code below to read it (e.g. below is reading in a .json file):

    String metadata = null;
    InputStream inputStream;
    try {

        ClassLoader loader = Thread.currentThread().getContextClassLoader();
        // Note: ClassLoader.getResourceAsStream() returns a plain InputStream,
        // and the resource name should not start with a leading slash
        inputStream = loader.getResourceAsStream("metadata.json");
        metadata = IOUtils.toString(inputStream);
        inputStream.close();
    }
    catch (FileNotFoundException e) {
        e.printStackTrace();
    }
    catch (IOException e) {
        e.printStackTrace();
    }
    return metadata;
    

You can do the same for text files, .properties files, XSD schemas, etc.

Crabbe answered 11/4, 2015 at 8:3 Comment(1)
You can't use this on 'any file you want'. You can only use it for resources that have been packaged into the JAR or WAR file.Perlie
G
1
try {
  File f = new File("filename.txt");
  Scanner r = new Scanner(f);  
  while (r.hasNextLine()) {
    String data = r.nextLine();
    JOptionPane.showMessageDialog(null, data);
  }
  r.close();
} catch (FileNotFoundException ex) {
  JOptionPane.showMessageDialog("Error occurred");
  ex.printStackTrace();
}
Godfather answered 23/12, 2019 at 13:26 Comment(0)
R
0

Use Java kiss if this is about simplicity of structure:

import static kiss.API.*;

class App {
  void run() {
    String line;
    try (Close in = inOpen("file.dat")) {
      while ((line = readLine()) != null) {
        println(line);
      }
    }
  }
}
Riojas answered 13/8, 2016 at 7:28 Comment(0)
M
0
import java.util.stream.Stream;
import java.nio.file.*;
import java.io.*;

class ReadFile {

 public static void main(String[] args) {

    String filename = "Test.txt";

    try(Stream<String> stream = Files.lines(Paths.get(filename))) {

          stream.forEach(System.out::println);

    } catch (IOException e) {

        e.printStackTrace();
    }

 }

 }

Just use the Java 8 Stream API.

Musette answered 16/9, 2019 at 10:24 Comment(0)
L
0

In case you have a large file you can use Apache Commons IO to process the file iteratively without exhausting the available memory.

try (LineIterator it = FileUtils.lineIterator(theFile, "UTF-8")) {
    while (it.hasNext()) {
        String line = it.nextLine();
        // do something with line
    }
}
Loudspeaker answered 1/3, 2022 at 14:58 Comment(0)
M
0
try (Stream<String> stream = Files.lines(Paths.get(String.valueOf(new File("yourFile.txt"))))) {
    stream.forEach(System.out::println);
} catch (IOException e) {
    e.printStackTrace();
}

new File(<path_name>)

Creates a new File instance by converting the given pathname string into an abstract pathname. If the given string is the empty string, then the result is the empty abstract pathname. Params: pathname – a pathname string. Throws: NullPointerException – if the pathname argument is null.

Files.lines returns a stream of String

Stream<String> stream = Files.lines(Paths.get(String.valueOf(new File("yourFile.txt")))) can throw a NullPointerException or a FileNotFoundException, so keeping it inside the try block takes care of exceptions at runtime.

stream.forEach(System.out::println);

This is used to iterate over the stream and print to the console. If you have a different use case, you can provide your own function to manipulate the stream of lines.

Mcwhirter answered 18/6, 2022 at 14:10 Comment(2)
Please read How do I write a good answer?. While this code block may answer the OP's question, this answer would be much more useful if you explain how this code is different from the code in the question, what you've changed, why you've changed it and why that solves the problem without introducing others.Regale
@SaeedZhiany Apologies! I didn't check the guidelines for writing a good answer on Stack Overflow. I will check them and edit the answer properly. Thank you for showing me the proper way :)Mcwhirter
C
0

My new favorite approach to simply read a whole text file from a BufferedReader input goes:

String text = input.lines().collect(Collectors.joining(System.lineSeparator()));

This reads the whole file, adding a line separator (lineSeparator) after each line. Without the separator it would join all lines together as one. This has been available since Java 8.
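
For context, a minimal self-contained sketch of this approach (notes.txt is a hypothetical file name; UTF-8 is an assumption):

import java.io.BufferedReader;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.stream.Collectors;

public class JoinLinesSketch {
    public static void main(String[] args) throws IOException {
        try (BufferedReader input = Files.newBufferedReader(Paths.get("notes.txt"), StandardCharsets.UTF_8)) {
            // Join all lines, re-inserting the platform line separator between them
            String text = input.lines().collect(Collectors.joining(System.lineSeparator()));
            System.out.println(text);
        }
    }
}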

Cleanser answered 21/7, 2022 at 11:58 Comment(0)
S
0

For Android developers ending up here (who use Kotlin):

val myFileUrl = object{}.javaClass.getResource("/vegetables.txt")
val text = myFileUrl.readText() // Not recommended for huge files
println(text)

Other solution:

val myFileUrl = object{}.javaClass.getResource("/vegetables.txt")
val file = File(myFileUrl.toURI())
val lines = file.readLines() // Not recommended for huge files
lines.forEach(::println)

Another good solution which can be used for huge files as well:

val myFileUrl = object{}.javaClass.getResource("/vegetables.txt")
val file = File(myFileUrl.toURI())
file
    .bufferedReader()
    .lineSequence()
    .forEach(::println)

Or:

val myFileUrl = object{}.javaClass.getResource("/vegetables.txt")
val file = File(myFileUrl.toURI())
file.useLines { lines ->
    lines.forEach(::println)
}

Notes:

  • The vegetables.txt file should be in your classpath (for example, in src/main/resources directory)

  • The above solutions all treat the file encodings as UTF-8 by default. You can specify your desired encoding as the argument for the functions.

  • The above solutions do not need any further action like closing the files or readers. They are automatically taken care of by the Kotlin standard library.

Shafer answered 16/12, 2022 at 10:10 Comment(0)
B
-2

This code I programmed is much faster for very large files:

public String readDoc(File f) {
    String text = "";
    int read, N = 1024 * 1024;
    char[] buffer = new char[N];

    try {
        FileReader fr = new FileReader(f);
        BufferedReader br = new BufferedReader(fr);

        while ((read = br.read(buffer, 0, N)) != -1) {
            text += new String(buffer, 0, read);
        }
        br.close();
    } catch(Exception ex) {
        ex.printStackTrace();
    }

    return text;
}
Bless answered 21/5, 2012 at 23:19 Comment(7)
Much faster, I doubt it, if you use simple string concatenation instead of a StringBuilder...Lyallpur
I think the main speed gain is from reading in 1MB (1024 * 1024) blocks. However you could do the same simply by passing 1024 * 1024 as second arg to BufferedReader constructor.Idioglossia
i don't believe this is tested at all. using += in this way gives you quadratic (!) complexity for a task that should be linear complexity. this will start to crawl for files over a few mb. to get around this you should either keep the textblocks in a list<string> or use the aforementioned stringbuilder.Should
Much faster than what? It most certainly is not faster than appending to a StringBuffer. -1Perlie
@Idioglossia I thought the same about buffer sizes, but the detailed experiment in this question gave surprising results in a similar context: a 16KB buffer was consistently and noticeably faster.Fronnia
@bcsb1001 Rubbish. They are not 'better'. They are merely non-thread-safe, an issue which doesn't arise here as the builder/buffer is method-local. Appending to a a string is very significantly slower than StringBuilder, which is why it exists, and which a simple, test will readily demonstrate. You also need to address the fact that this code doesn't handle end of stream correctly, or indeed all.Perlie
@EJP Sorry about that comment, that was in 2015. Still, as the Javadoc of StringBuffer says, "The StringBuilder class should generally be used in preference to this one, as it supports all of the same operations but it is faster, as it performs no synchronization." And synchronisation != thread-safety. But I agree with you that appending to a StringBuilder is faster than appending to a String, since it was designed to be mutable.Driblet
