
Ten Things I Love About Java 17

Ten features to help justify a move from Java 8 to Java 17

Photo by Tim Mossholder on Unsplash

Java 17 has been out for about a month now and I keep finding more and more things I like about it. If you are still on Java 8, moving to Java 17 might seem a bit overwhelming due to the sheer number of changes. With that in mind, here are ten of the features in Java 17 that I find most compelling for a day-to-day Java developer. These are features you can use today to make your code clearer and help it perform better.

These features aren’t in a particular order (I used the new RandomGenerator to decide the order for me), so let’s dive right in.

Helpful NullPointerExceptions

Nobody likes to have a NullPointerException in their code, so what’s all this talk about them being “helpful”? To show what I mean, let’s look at some code.

String street = getAccount().getUser().getPrimaryAddress().getStreet().trim();

If we run this using Java 8 or Java 11, the code might throw this exception:

Exception in thread "main" java.lang.NullPointerException
	at Main.main(Main.java:8)

Where’s the null reference? The account? The user? The primary address? The street? It could be any of them. We’d probably have to resort to a debugger or some printlns to find out for sure. But now, thanks to JEP 358, we can get more detail:

Exception in thread "main" java.lang.NullPointerException: Cannot invoke "Address.getStreet()" 
  because the return value of "User.getPrimaryAddress()" is null
	at Main.main(Main.java:8)

Aha! Now it’s clear that getPrimaryAddress() is returning null, and we can’t call getStreet() on a null Address! This feature is on by default in Java 17, but it can be turned off via the -XX:-ShowCodeDetailsInExceptionMessages flag if you have a use case where you need the old format (for parsing logs, perhaps).
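
If you do need to turn it off, the flag is passed on the command line when starting the application (Main here is just the class from the stack trace above):

java -XX:-ShowCodeDetailsInExceptionMessages Main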

Record Classes

Of all the features I’ve written about here, records are the one I’m most excited about, not only for what they can do for our code today, but for what they will do for both our code and the libraries we use in the future. Java records are immutable data carriers. Of course, we could write immutable data carriers using any version of Java we want, but records allow us to quickly and succinctly define a class for our immutable data without a lot of ceremony.

Records should be thought of as immutable (they can’t change), transparent (we can see the fields) data holders (they have no behavior).

For example, if we were to write a Person class before records, we would have to write something like this:

public class Person {
    final String name;
    final int age;

    public Person(String name, int age) {
        this.name = name;
        this.age = age;
    }

    public String getName() {
        return name;
    }

    public int getAge() {
        return age;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (o == null || getClass() != o.getClass()) return false;
        Person person = (Person) o;
        return age == person.age && Objects.equals(name, person.name);
    }

    @Override
    public int hashCode() {
        return Objects.hash(name, age);
    }

    @Override
    public String toString() {
        return "Person{" +
                "name='" + name + '\'' +
                ", age=" + age +
                '}';
    }
}

We’ve probably all written classes like this. More accurately, we’ve probably had our IDE write classes like this. Anything after the first couple of lines is boilerplate and can mentally be ignored. Maybe. Perhaps somebody has done something “interesting” in one of the methods, and now we have to read it all to make sure. That’s cognitive load we probably don’t need to take on.

The same Person, with the same two properties, looks like this as a record:

public record Person(String name, int age) { }

That’s it! Java provides everything we need: an all-arguments constructor, public getters for our two private fields, and sensible implementations of toString(), hashCode(), and equals(). It’s much easier to read and we know exactly what we’re getting. One difference to note, however, is the naming of the generated getters. Rather than getName() or getAge() as we have in our original class, in a record these are called name() and age(). Not a big deal, just something to get used to. I should also mention that while records come with sensible defaults for these methods, we can still override them if we want.

There are some restrictions when using records. For example, all records implicitly extend java.lang.Record and are final, which means a record cannot extend any class other than Record, nor can it be extended. We also can’t define any new instance fields, only the components we specify in the header.

Constructing a record works just like we’re used to with regular classes:

final Person person = new Person("Stephen Falken", 58);
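
And the generated accessors and toString() behave just as you’d expect (outputs shown as comments):

System.out.println(person.name()); // Stephen Falken
System.out.println(person.age());  // 58
System.out.println(person);        // Person[name=Stephen Falken, age=58]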

We can implement whatever interfaces we like, and add new methods (but not new fields) to our record:

public record Person(String name, int age) {
    public boolean isTeenager() {
        return age >= 13 && age < 20;
    }
}

Even though records give us a canonical constructor for free, we can also define our own constructors if we want to do some extra validation, for example:

public record Person(String name, int age) {
    public Person(String name, int age) {
        Objects.requireNonNull(name);
        this.name = name;
        this.age = age;
    }
}

Since that constructor takes all of the fields we’ve defined, in the order we defined them, we can simplify it further:

public record Person(String name, int age) {
    public Person {
        Objects.requireNonNull(name);
    }
}

Both of those constructors do the same thing: validate our input and initialize the fields. The latter, a compact constructor, does most of this implicitly by omitting the repeated parameter list and the field assignments.

We could also define a completely different kind of constructor if we want (here assuming some User class with name() and age() accessors):

public record Person(String name, int age) {
    public Person(User user) {
        this(user.name(), user.age());
    }
}

Be aware that while records are immutable, they are only shallowly immutable. That means if a record has a field whose type is not itself immutable, things can still change. For example, if a record holds a List<String> as one of its fields, we may be able to add or remove items from that list. The immutability is therefore “shallow”.
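
For example, here’s a minimal sketch with a hypothetical Roster record that holds a mutable List (the usual java.util imports are assumed):

record Roster(String teamName, List<String> players) { }

final List<String> players = new ArrayList<>(List.of("Todd", "Anna"));
final Roster roster = new Roster("Blue Team", players);

players.add("Emma");                  // changes the list the record is holding
System.out.println(roster.players()); // [Todd, Anna, Emma]

If you need deeper immutability, a compact constructor is a good place to make a defensive copy, for example players = List.copyOf(players);.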

So where do we use records?

  • DTOs - any request/response objects from a REST API are good candidates since they are already transparent, immutable data holders (or probably should be).
  • Immutable generic tuple classes can be replaced with a record. Replace any kind of Pair<A,B> you might have written with a purpose-built record (see the sketch after this list).
  • JPA Projections - while records can’t be used for entities, they can be used for projections.
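
As a quick illustration of that second point, a generic Pair<Double, Double> holding coordinates could become a small, purpose-built record (the names here are just for illustration):

record Coordinates(double latitude, double longitude) { }

final Coordinates chicago = new Coordinates(41.88, -87.63);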

And this is just the start; there are more changes coming for records in the future. Records don’t currently support destructuring or “withers” (imagine: Person newPerson = existingPerson.withAge(21)), but there’s no reason those things can’t be added in a future version of Java without breaking compatibility. Records are very useful now, so we should start using them where appropriate.

Switch Expressions

Up until now, switches in Java have been statements. That is, they themselves don’t return a value and anything they do is purely a side-effect. What’s changed in Java 17 is that now we can (if we want) treat a switch as an expression, something that directly returns a value.

For example, if we wanted to use a switch to turn a String into a number, we might write something like this:

public int convertToInt(final String numberAsString) {
    int asInt = 0;
    switch (numberAsString) {
        case "one":
            asInt = 1;
            break;
        case "two":
            asInt = 2;
            break;
        default:
            asInt = 42;

    }
    return asInt;
}

First, we declare asInt with a default value. Then we use the switch to evaluate all of the cases we care about, including a default value. In each of the case blocks we set the value of asInt as appropriate and break so we don’t fall through to the next case statement. That all works fine, but it’s a bit verbose. My biggest problem with a switch statement is forgetting to put in the break, which leads to bugs.

With switch expressions, we can simplify this:

public int convertToInt(final String numberAsString) {
    return switch (numberAsString) {
        case "one" -> 1;
        case "two" -> 2;
        default -> 42;
    };
}

Because the switch itself produces a value, we can skip setting up the asInt variable and instead return the value the switch produces directly. One thing to keep in mind is that switch expressions must be exhaustive, meaning every possible input must produce an output; if we had skipped the default case, this example would not compile. You might also notice the new -> syntax involved in switch expressions. An “arrow case” does not require a break and will not fall through.
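
If a case needs more than a single expression, an arrow case can use a block and yield its result (the println here is just for illustration):

public int convertToInt(final String numberAsString) {
    return switch (numberAsString) {
        case "one" -> 1;
        case "two" -> 2;
        default -> {
            System.out.println("Unexpected value: " + numberAsString);
            yield 42;
        }
    };
}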

My favorite part - not only are switch expressions easier to read, they reduce the chance of bugs because cases don’t fall through automatically!

For a more thorough explanation of all the ins and outs of switch expressions, check out Oracle’s switch expression documentation.

New Garbage Collectors

Before we talk about specific garbage collectors, let’s talk about garbage collection in general. Most modern garbage collection algorithms have three phases. First, the garbage collector marks the objects that are still reachable. Second, it sweeps (removes) the objects that were not marked, since nothing can reach them anymore. Over time the heap becomes fragmented: even though we have tons of memory left, it might be in such small slices that there is no contiguous region available to allocate a large object. So in the third phase, the garbage collector compacts the remaining objects in order to use the heap space more efficiently.

When designing a garbage collection algorithm, the two main concerns to optimize for are throughput (the total time your application can spend doing its own work instead of collecting garbage) and latency (if the garbage collector must “stop the world” to mark, sweep, or compact, how long does it pause for?). Some algorithms try to find a balance between throughput and latency, while others favor one over the other. Garbage collectors can also have constraints on heap size, working better on either smaller or larger heaps.

Now that we have that foundation, we can talk about garbage collectors in Java 17.

G1GC

OK, G1GC isn’t new to Java 17; it’s been around for a while and you might have already used it in Java 8. I’m listing it here as “new” because, under the covers, G1GC has been receiving steady updates to make it better and faster than it was in Java 8. G1GC optimizes for throughput. To do this it will mark and sweep concurrently alongside your application’s work, but will stop the world for its compaction phase. One setting for G1GC that we can play with is the pause-time budget. We tell G1GC how long we’re willing to tolerate a full stop-the-world pause, and it will try its best to hit that budget. If we set the budget too small, we might end up having more (but smaller) pauses than we’d like, so be careful when tuning this aspect.
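
That pause-time budget is set with the MaxGCPauseMillis flag; the value and jar name below are just placeholders:

java -XX:+UseG1GC -XX:MaxGCPauseMillis=200 -jar my-app.jar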

ZGC

ZGC optimizes for latency and will concurrently mark, sweep, and compact. There are still some (very) slight stop-the-world pauses between these phases, but they are frequently less than a millisecond. ZGC works well on heaps of nearly any size, from 8MB (yes, you can run a Java app in a heap that small) all the way up to 16TB. No matter the heap size, you should see consistently small pauses.
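
ZGC ships with Java 17 but isn’t the default, so we opt in with a flag (the heap size and jar name are placeholders):

java -XX:+UseZGC -Xmx4g -jar my-app.jar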

Shenandoah

On the surface Shenandoah might appear similar to ZGC: it optimizes for latency, concurrently marks, sweeps, and compacts, and also has ultra-low pause times. The difference is that Shenandoah is implemented using a different internal algorithm than ZGC (which I am not going to get into here). So if ZGC seems appealing, try Shenandoah as well, because one might work better for your application than the other. I should note that Shenandoah is not available in every distribution of Java. The Oracle distributions, for example, do not include it. However, Red Hat, the original authors of Shenandoah, have backported it to Java 11 and Java 8, so you can try it now even if you’re not on Java 17 yet.
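
In a distribution that includes Shenandoah, it’s enabled with its own flag (the jar name is a placeholder):

java -XX:+UseShenandoahGC -jar my-app.jar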

Epsilon

Epsilon is unique in that it doesn’t really collect anything. It will allocate a heap of memory for your application but will never mark, sweep, or compact it. That might not seem all that useful, but it does have its place. Suppose we’re trying to get that last drop of performance out of the JVM by not allocating anything once we reach a steady state: how could we prove that it works? We could run with Epsilon. Similarly, if we have a job that is very short-lived, we might want to use Epsilon so garbage collection doesn’t get in the way at all. Epsilon is also useful for testing various aspects of the JVM itself without having to account for the garbage collector.
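
Since Epsilon is still an experimental collector, enabling it takes two flags (the jar name is a placeholder):

java -XX:+UnlockExperimentalVMOptions -XX:+UseEpsilonGC -jar my-app.jar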

It also goes without saying that Epsilon is a fantastic pranking tool. I’m not here to say that enabling Epsilon as a small detail in an otherwise large PR on your last day before leaving a job is a good thing to do, but I’m also not not saying that either. ;)

As you can see, there are a lot of options for garbage collection, and it can be overwhelming. None of these is a one-size-fits-all choice, and you should experiment with your actual application during a sustained performance test before deciding which algorithm and settings are right for your app.

Stream.toList()

How many times have you turned a Stream into a List? You probably used code like this to do it:

someList.stream()
    .map(name -> ...)
    .collect(Collectors.toList());  // <---- Old Way!

There’s nothing wrong with that code. It gets the job done. But for such a common use case, it’s a bit verbose. Now, there’s a more direct way to do this, thanks to Stream.toList():

someList.stream()
    .map(name -> ...)
    .toList();     // <---- New Way!

The new way is less verbose, but the older way still works if you prefer it.

There is one important difference to note before using this new method: Stream.toList() returns an unmodifiable List, while Collectors.toList() returns a modifiable ArrayList. Personally, I like this, as we should favor immutability over mutability whenever possible. So be careful when converting your code, because you might end up with an UnsupportedOperationException if you try to modify the List returned from Stream.toList().
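
A quick sketch of the difference (the names are just sample data, and the usual java.util.stream imports are assumed):

final List<String> unmodifiable = Stream.of("Todd", "Anna").toList();
unmodifiable.add("Emma");  // throws UnsupportedOperationException

final List<String> modifiable = Stream.of("Todd", "Anna").collect(Collectors.toList());
modifiable.add("Emma");    // fine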

Pattern Matching for instanceof

Let’s imagine a scenario where we have a Java object whose type we suspect, but do not know. We want to test that the type is what we think it is, and if so, cast that object to the type we really want and do something with the new object of the proper type. To do that, we might write some code like this:

if(someObject instanceof String) {
    String someString = (String) someObject;
    doSomethingWithString(someString);
}

This is a common use case, and Java 17 comes with a feature called Pattern Matching for instanceof to help make it a bit cleaner:

if(someObject instanceof String someString) {
    doSomethingWithString(someString);
}

When using instanceof we can declare a new local variable (someString) after the type (String). The local variable is limited in scope to the if statement here.
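
The pattern variable is also usable later in the same condition, which makes combined checks read nicely (the length check is just an example):

if (someObject instanceof String someString && someString.length() > 5) {
    doSomethingWithString(someString);
}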

For more details on this, please check out my dedicated post on Pattern Matching for instanceof.

RandomGenerator and RandomGeneratorFactory

Java’s Random class, which dates all the way back to Java 1.0, has been used to generate random numbers for a very long time. We can still use Random, but Java 17 gives us a newer, better way to get access to random number generators. There is a new interface called RandomGenerator, which is now implemented by all of the existing random number generators (Random, SecureRandom, ThreadLocalRandom, and SplittableRandom). This change also introduces the concept of a platform-default random source.

For historical reasons, we can still write code like this:

// Traditional way
final Random random = new Random();

But now that we have RandomGenerator, we should use that, and have it pick the platform default random source for us:

// New way
final RandomGenerator random = RandomGenerator.getDefault();
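
Once we have a generator, we use it much like we would use Random; the dice roll below is just an example:

final int dieRoll = random.nextInt(1, 7);       // from 1 (inclusive) to 7 (exclusive)
final double probability = random.nextDouble(); // from 0.0 (inclusive) to 1.0 (exclusive)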

Note that the default generator returned from RandomGenerator.getDefault() is different from the implementation in the Random class, which is now considered a legacy implementation. While the old way still works, it’s a good idea to move to the new way unless you have something that really depends on how Random generates its numbers. Using SecureRandom is still recommended for generating cryptographically secure numbers. Java 17 also ships with several different random algorithms, and we can select any of them by name if we want:

final RandomGenerator random = RandomGenerator.of("L128X1024MixRandom");

For a full list of the algorithms and a description of each one, consult the Javadoc for RandomGenerator.

Another new feature in Java 17 is RandomGeneratorFactory, a class for finding and instantiating RandomGenerators. Why bother with that? Because now we can pick a RandomGenerator based on its properties instead of using the default or knowing the name of a specific algorithm. For example, if we wanted to pick the non-deprecated RandomGenerator with the highest number of state bits, we could write this code to find it:

final RandomGenerator random = RandomGeneratorFactory
        .all()
        .sorted(Comparator.comparingInt(RandomGeneratorFactory<RandomGenerator>::stateBits).reversed())
        .filter(it -> !it.isDeprecated())
        .findFirst()
        .orElse(RandomGeneratorFactory.getDefault())
        .create();

I have borrowed this example largely from the Javadoc for RandomGeneratorFactory, which also contains a full list of the properties we can filter on and their meanings.

Finally, it is possible to write our own implementations of RandomGenerator, though I’m struggling to think of why a day-to-day Java developer would need to. Perhaps something like this would ship with a hardware random number device, or perhaps one of the JDK vendors will choose to include a new set of random algorithms.
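
If you did want to try it, RandomGenerator only requires us to implement nextLong(). Here’s a toy sketch using Marsaglia’s xorshift algorithm (definitely not something to use where quality or security matters):

import java.util.random.RandomGenerator;

public final class XorShiftGenerator implements RandomGenerator {

    private long state;

    public XorShiftGenerator(long seed) {
        // xorshift must never start with an all-zero state
        this.state = (seed == 0) ? 0x9E3779B97F4A7C15L : seed;
    }

    @Override
    public long nextLong() {
        // Marsaglia's xorshift64 step
        state ^= state << 13;
        state ^= state >>> 7;
        state ^= state << 17;
        return state;
    }
}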

Sealed Types

Sometimes when we write software, it would be nice to restrict a class hierarchy so that only the implementations we anticipate are allowed. Up until now, Java hasn’t had an elegant solution for this. Sometimes we can get away with using an enum and its ability to define a finite set of values, but that doesn’t always work. What if we want to ship a library and limit our consumers’ ability to implement our interfaces? Or what if we want to carry state in our classes (making enums a bad choice) but still limit which subtypes can exist?

Enter sealed types. With sealed types, we can define interfaces or classes and tell Java to only permit specific implementations or subclasses.

For example:


sealed interface Message permits Start, Stop, Timeout {}

record Start(String serverName) implements Message {}

record Stop(String reason) implements Message {}

final class Timeout implements Message {
    public Timeout(Duration duration) {
        // ...
    }
}

Our Message interface only allows specific classes to implement it: Start, Stop, and Timeout. If we were to define another class and have it implement Message, the compiler would not allow it. In this specific case we’re using records to implement Start and Stop, but a regular class to implement Timeout. Either will work; I’m just doing this to illustrate that we have options here.

In the case of Timeout, note that every class implementing a sealed interface must itself be final, sealed, or non-sealed (records like Start and Stop are implicitly final). Here we mark Timeout as final so that it cannot have any subclasses. We could instead mark it sealed with a permits clause of its own, allowing Timeout to permit specific subclasses and giving us a full tree of classes.

What if we wanted a sealed hierarchy, but also wanted one specific type to be allowed to have whatever subclasses it wants? For example, suppose our library has its own internal implementations of an interface, but we want to let our clients subclass one designated base class, and only that base class. Java has us covered with the non-sealed keyword. This tells Java that the class participates in the sealed type hierarchy, but is not itself sealed. For example…


non-sealed class Timeout implements Message {
    public Timeout(Duration duration) {
        // ...
    }
}

In this case, we’ve modified our Timeout class to be non-sealed, so anything that subclasses it will be allowed.
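
For example, a client of our library could now extend the non-sealed Timeout however they like (NetworkTimeout here is purely hypothetical):

class NetworkTimeout extends Timeout {
    public NetworkTimeout(Duration duration) {
        super(duration);
    }
}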

I’ll admit I haven’t done a lot with sealed classes in Java yet, but I do like the flexibility they give me when I write Kotlin (which has its own sealed types). I’m eager to see what the Java community comes up with for sealed types; I’m hoping they give us more control and flexibility at the same time, and get us away from using enums where they aren’t really suited.

Static Factory Methods to Create Immutable Collections

Before now, initializing small collections in Java has been a bit more work than in other languages. Java 17 gives us static factory methods on the core collection interfaces that do the job in a single expression.

With these methods, we can easily create immutable sets:

final Set<String> names = Set.of("Todd", "Anna", "Emma");

Or lists:

final List<String> names = List.of("Todd", "Anna", "Emma");

Or maps:

final Map<String, Color> favoriteColors = Map.of(
    "Todd", Color.Green, 
    "Anna", Color.Blue, 
    "Emma", Color.Red
);

Note that Map.of only has overloads for up to 10 key/value pairs, so for maps with 11 or more entries we’ll need to use Map.ofEntries:

import static java.util.Map.entry;

final Map<String, Color> favoriteColors = Map.ofEntries(
    entry("Todd", Color.Green),
    entry("Anna", Color.Blue),
    entry("Emma", Color.Red)
);

If it wasn’t clear from the title, these collections are immutable, meaning they can’t be altered once created. I love this feature because all too often I’ll see developers write utility code to do this or, worse yet, pull in an entire library just for this kind of work (I have definite opinions on that practice!).
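
And if you do need a mutable collection, these factory methods still make a handy starting point; just copy the result into a mutable implementation (the extra name is only an example):

final List<String> mutableNames = new ArrayList<>(List.of("Todd", "Anna", "Emma"));
mutableNames.add("Sam"); // fine, because ArrayList is modifiable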

Text Blocks

Have you ever had to define some JSON or SQL as a String in Java? If so, you’ve probably written some code like this:

String someJson = "{\n" +
        "   \"name\": {\n" +
        "     \"first\": \"Todd\",\n" +
        "     \"last\": \"Ginsberg\"\n" +
        "   },\n" +
        "   \"points\": 42\n" +
        "}";

It’s messy having to concatenate substrings and escape quotes and newlines; it’s more work than it has to be. Thankfully, Text Blocks are here to make life easier! With Text Blocks, we can enclose a multiline string with three quotes (""") and we’re done.

String someJson = """
        {
           "name": {
             "first": "Todd",
             "last": "Ginsberg"
           },
           "points": 42
        }
        """;

There are a few things to know about Text Blocks before using them. The opening set of three quotes (""") must be followed by a newline, but the closing quotes do not have to be on their own line (this helps if you don’t want a newline character at the end of your String). The next thing to be aware of is how leading and trailing whitespace is handled. All common leading whitespace is trimmed off: in practical terms, Java works out the minimum indentation across the lines of the Text Block and strips that much from every line. Additionally, all trailing whitespace (except the newline) is trimmed from each line.

This is very useful for SQL queries, JSON (perhaps in tests?), or any hard-coded text.
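
For example, a SQL query (the table and column names here are made up) becomes much easier to read:

String findTeenagers = """
        SELECT name, age
        FROM person
        WHERE age BETWEEN 13 AND 19
        ORDER BY name
        """;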

In Conclusion…

If you are moving from Java 8 to Java 17, there are hundreds of new features, so I’ve only covered the ten I find most compelling. I’d love to hear which Java 17 feature you like best, so drop me a line (contact info below)! I hope you found this helpful.

TL;DR: