diff --git a/docs/_docs/reference/changed-features/wildcards.md b/docs/_docs/reference/changed-features/wildcards.md index 0d3e13c3d7e0..ac7235770e36 100644 --- a/docs/_docs/reference/changed-features/wildcards.md +++ b/docs/_docs/reference/changed-features/wildcards.md @@ -4,7 +4,7 @@ title: Wildcard Arguments in Types nightlyOf: https://docs.scala-lang.org/scala3/reference/changed-features/wildcards.html --- -The syntax of wildcard arguments in types has changed from `_` to `?`. Example: +The syntax of wildcard arguments in types is changing from `_` to `?`. Example: ```scala List[?] Map[? <: AnyRef, ? >: Null] @@ -14,8 +14,8 @@ Map[? <: AnyRef, ? >: Null] We would like to use the underscore syntax `_` to stand for an anonymous type parameter, aligning it with its meaning in value parameter lists. So, just as `f(_)` is a shorthand for the lambda `x => f(x)`, in the future `C[_]` will be a shorthand -for the type lambda `[X] =>> C[X]`. This makes higher-kinded types easier to use. It also removes the wart that, used as a type -parameter, `F[_]` means `F` is a type constructor whereas used as a type, `F[_]` means it is a wildcard (i.e. existential) type. +for the type lambda `[X] =>> C[X]`. This will make higher-kinded types easier to use. It will also remove the wart that, used as a type +parameter, `F[_]` means `F` is a type constructor, whereas used as a type, `F[_]` means it is a wildcard (i.e. existential) type. In the future, `F[_]` will mean the same thing, no matter where it is used. We pick `?` as a replacement syntax for wildcard types, since it aligns with @@ -28,11 +28,11 @@ compiler plugin still uses the reverse convention, with `?` meaning parameter pl A step-by-step migration is made possible with the following measures: - 1. In Scala 3.0, both `_` and `?` are legal names for wildcards. - 2. In Scala 3.1, `_` is deprecated in favor of `?` as a name for a wildcard. A `-rewrite` option is + 1. 
In earlier versions of Scala 3, both `_` and `?` are legal names for wildcards. + 2. In Scala 3.4, `_` will be deprecated in favor of `?` as a name for wildcards. A `-rewrite` option is available to rewrite one to the other. - 3. In Scala 3.2, the meaning of `_` changes from wildcard to placeholder for type parameter. - 4. The Scala 3.1 behavior is already available today under the `-source future` setting. + 3. At some later point in the future, the meaning of `_` will change from wildcard to placeholder for type parameters. + 4. Some deprecation warnings are already available under the `-source future` setting. To smooth the transition for codebases that use kind-projector, we adopt the following measures under the command line option `-Ykind-projector`: @@ -42,7 +42,7 @@ option `-Ykind-projector`: available to rewrite one to the other. 3. In Scala 3.3, `*` is removed again, and all type parameter placeholders will be expressed with `_`. -These rules make it possible to cross build between Scala 2 using the kind projector plugin and Scala 3.0 - 3.2 using the compiler option `-Ykind-projector`. +These rules make it possible to cross-build between Scala 2 using the kind projector plugin and Scala 3.0 - 3.2 using the compiler option `-Ykind-projector`. There is also a migration path for users that want a one-time transition to syntax with `_` as a type parameter placeholder. With option `-Ykind-projector:underscores` Scala 3 will regard `_` as a type parameter placeholder, leaving `?` as the only syntax for wildcards. diff --git a/docs/_docs/reference/contextual/context-functions.md b/docs/_docs/reference/contextual/context-functions.md index 0ad3c8757782..0d174583f230 100644 --- a/docs/_docs/reference/contextual/context-functions.md +++ b/docs/_docs/reference/contextual/context-functions.md @@ -8,27 +8,29 @@ _Context functions_ are functions with (only) context parameters. Their types are _context function types_. 
Here is an example of a context function type: ```scala +import scala.concurrent.ExecutionContext + type Executable[T] = ExecutionContext ?=> T ``` Context functions are written using `?=>` as the "arrow" sign. They are applied to synthesized arguments, in the same way methods with context parameters are applied. For instance: ```scala - given ec: ExecutionContext = ... +given ec: ExecutionContext = ... - def f(x: Int): ExecutionContext ?=> Int = ... +def f(x: Int): ExecutionContext ?=> Int = ... - // could be written as follows with the type alias from above - // def f(x: Int): Executable[Int] = ... +// could be written as follows with the type alias from above +// def f(x: Int): Executable[Int] = ... - f(2)(using ec) // explicit argument - f(2) // argument is inferred +f(2)(using ec) // explicit argument +f(2) // argument is inferred ``` Conversely, if the expected type of an expression `E` is a context function type `(T_1, ..., T_n) ?=> U` and `E` is not already an context function literal, `E` is converted to a context function literal by rewriting it to ```scala - (x_1: T1, ..., x_n: Tn) ?=> E +(x_1: T1, ..., x_n: Tn) ?=> E ``` where the names `x_1`, ..., `x_n` are arbitrary. This expansion is performed before the expression `E` is typechecked, which means that `x_1`, ..., `x_n` @@ -38,14 +40,14 @@ Like their types, context function literals are written using `?=>` as the arrow For example, continuing with the previous definitions, ```scala - def g(arg: Executable[Int]) = ... +def g(arg: Executable[Int]) = ... 
- g(22) // is expanded to g((ev: ExecutionContext) ?=> 22) +g(22) // is expanded to g((ev: ExecutionContext) ?=> 22) - g(f(2)) // is expanded to g((ev: ExecutionContext) ?=> f(2)(using ev)) +g(f(2)) // is expanded to g((ev: ExecutionContext) ?=> f(2)(using ev)) - g((ctx: ExecutionContext) ?=> f(3)) // is expanded to g((ctx: ExecutionContext) ?=> f(3)(using ctx)) - g((ctx: ExecutionContext) ?=> f(3)(using ctx)) // is left as it is +g((ctx: ExecutionContext) ?=> f(3)) // is expanded to g((ctx: ExecutionContext) ?=> f(3)(using ctx)) +g((ctx: ExecutionContext) ?=> f(3)(using ctx)) // is left as it is ``` ## Example: Builder Pattern @@ -54,63 +56,65 @@ Context function types have considerable expressive power. For instance, here is how they can support the "builder pattern", where the aim is to construct tables like this: ```scala - table { - row { - cell("top left") - cell("top right") - } - row { - cell("bottom left") - cell("bottom right") - } +table { + row { + cell("top left") + cell("top right") + } + row { + cell("bottom left") + cell("bottom right") } +} ``` The idea is to define classes for `Table` and `Row` that allow the addition of elements via `add`: ```scala - class Table: - val rows = new ArrayBuffer[Row] - def add(r: Row): Unit = rows += r - override def toString = rows.mkString("Table(", ", ", ")") +import scala.collection.mutable.ArrayBuffer + +class Table: + val rows = new ArrayBuffer[Row] + def add(r: Row): Unit = rows += r + override def toString = rows.mkString("Table(", ", ", ")") - class Row: - val cells = new ArrayBuffer[Cell] - def add(c: Cell): Unit = cells += c - override def toString = cells.mkString("Row(", ", ", ")") +class Row: + val cells = new ArrayBuffer[Cell] + def add(c: Cell): Unit = cells += c + override def toString = cells.mkString("Row(", ", ", ")") - case class Cell(elem: String) +case class Cell(elem: String) ``` Then, the `table`, `row` and `cell` constructor methods can be defined with context function types as parameters to 
avoid the plumbing boilerplate that would otherwise be necessary. ```scala - def table(init: Table ?=> Unit) = - given t: Table = Table() - init - t - - def row(init: Row ?=> Unit)(using t: Table) = - given r: Row = Row() - init - t.add(r) - - def cell(str: String)(using r: Row) = - r.add(new Cell(str)) +def table(init: Table ?=> Unit) = + given t: Table = Table() + init + t + +def row(init: Row ?=> Unit)(using t: Table) = + given r: Row = Row() + init + t.add(r) + +def cell(str: String)(using r: Row) = + r.add(new Cell(str)) ``` With that setup, the table construction code above compiles and expands to: ```scala - table { ($t: Table) ?=> - - row { ($r: Row) ?=> - cell("top left")(using $r) - cell("top right")(using $r) - }(using $t) - - row { ($r: Row) ?=> - cell("bottom left")(using $r) - cell("bottom right")(using $r) - }(using $t) - } +table { ($t: Table) ?=> + + row { ($r: Row) ?=> + cell("top left")(using $r) + cell("top right")(using $r) + }(using $t) + + row { ($r: Row) ?=> + cell("bottom left")(using $r) + cell("bottom right")(using $r) + }(using $t) +} ``` ## Example: Postconditions @@ -131,12 +135,18 @@ import PostConditions.{ensuring, result} val s = List(1, 2, 3).sum.ensuring(result == 6) ``` -**Explanations**: We use a context function type `WrappedResult[T] ?=> Boolean` +### Explanation + +We use a context function type `WrappedResult[T] ?=> Boolean` as the type of the condition of `ensuring`. An argument to `ensuring` such as `(result == 6)` will therefore have a given of type `WrappedResult[T]` in -scope to pass along to the `result` method. `WrappedResult` is a fresh type, to make sure +scope to pass along to the `result` method. + +`WrappedResult` is a fresh type, to make sure that we do not get unwanted givens in scope (this is good practice in all cases -where context parameters are involved). Since `WrappedResult` is an opaque type alias, its +where context parameters are involved). 
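The explanation refers to `WrappedResult`, `result`, and `ensuring` without showing their definitions (they sit outside this excerpt). A minimal sketch of how such a `PostConditions` object can be written, assuming the shape the surrounding text describes:

```scala
object PostConditions:
  // A fresh opaque type, so no unrelated given of the underlying type leaks into scope
  opaque type WrappedResult[T] = T

  // `result` simply reads the value back out of the context
  def result[T](using r: WrappedResult[T]): T = r

  extension [T](x: T)
    def ensuring(condition: WrappedResult[T] ?=> Boolean): T =
      // Inside the object the opaque alias is transparent, so `x: T` can be passed
      assert(condition(using x))
      x

import PostConditions.{ensuring, result}

val s = List(1, 2, 3).sum.ensuring(result == 6)
```

Because `ensuring`'s parameter has a context function type, the bare expression `result == 6` is expanded to a context function literal whose parameter supplies the `WrappedResult[Int]` that `result` summons.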
+ +Since `WrappedResult` is an opaque type alias, its values need not be boxed, and since `ensuring` is added as an extension method, its argument does not need boxing either. Hence, the implementation of `ensuring` is close in efficiency to the best possible code one could write by hand: diff --git a/docs/_docs/reference/contextual/derivation.md b/docs/_docs/reference/contextual/derivation.md index 66d0cf3fdf38..ed0e005c1bd4 100644 --- a/docs/_docs/reference/contextual/derivation.md +++ b/docs/_docs/reference/contextual/derivation.md @@ -104,7 +104,7 @@ given TC[DerivingType] = TC.derived // simplified form of: given TC[ [A_1, ..., A_K] =>> DerivingType[A_1, ..., A_K] ] = TC.derived ``` -If `DerivingType` takes less arguments than `F` (`N < K`), we use only the rightmost parameters from the type lambda: +If `DerivingType` takes fewer arguments than `F` (`N < K`), we use only the rightmost parameters from the type lambda: ```scala given TC[ [A_1, ..., A_K] =>> DerivingType[A_(K-N+1), ..., A_K] ] = TC.derived @@ -112,7 +112,7 @@ given TC[ [A_1, ..., A_K] =>> DerivingType[A_(K-N+1), ..., A_K] ] = TC.derived given TC[ [A_1, ..., A_K] =>> DerivingType ] = TC.derived ``` -If `F` takes less arguments than `DerivingType` (`K < N`), we fill in the remaining leftmost slots with type parameters of the given: +If `F` takes fewer arguments than `DerivingType` (`K < N`), we fill in the remaining leftmost slots with type parameters of the given: ```scala given [T_1, ... T_(N-K)]: TC[[A_1, ..., A_K] =>> DerivingType[T_1, ... T_(N-K), A_1, ..., A_K]] = TC.derived ``` @@ -158,7 +158,7 @@ of the `Mirror` type class available. ## `Mirror` `scala.deriving.Mirror` type class instances provide information at the type level about the components and labelling of the type. 
-They also provide minimal term level infrastructure to allow higher level libraries to provide comprehensive +They also provide minimal term-level infrastructure to allow higher-level libraries to provide comprehensive derivation support. Instances of the `Mirror` type class are generated automatically by the compiler @@ -269,14 +269,14 @@ No given instance of type deriving.Mirror.Of[A] was found for parameter x of met Note the following properties of `Mirror` types, + Properties are encoded using types rather than terms. This means that they have no runtime footprint unless used and - also that they are a compile time feature for use with Scala 3's metaprogramming facilities. + also that they are a compile-time feature for use with Scala 3's metaprogramming facilities. + There is no restriction against the mirrored type being a local or inner class. + The kinds of `MirroredType` and `MirroredElemTypes` match the kind of the data type the mirror is an instance for. This allows `Mirror`s to support ADTs of all kinds. + There is no distinct representation type for sums or products (ie. there is no `HList` or `Coproduct` type as in Scala 2 versions of Shapeless). Instead the collection of child types of a data type is represented by an ordinary, possibly parameterized, tuple type. Scala 3's metaprogramming facilities can be used to work with these tuple types - as-is, and higher level libraries can be built on top of them. + as-is, and higher-level libraries can be built on top of them. + For both product and sum types, the elements of `MirroredElemTypes` are arranged in definition order (i.e. `Branch[T]` precedes `Leaf[T]` in `MirroredElemTypes` for `Tree` because `Branch` is defined before `Leaf` in the source file). This means that `Mirror.Sum` differs in this respect from Shapeless's generic representation for ADTs in Scala 2, @@ -303,16 +303,16 @@ has a context `Mirror` parameter, or not at all (e.g. 
they might use some comple instance using Scala 3 macros or runtime reflection). We expect that (direct or indirect) `Mirror` based implementations will be the most common and that is what this document emphasises. -Type class authors will most likely use higher level derivation or generic programming libraries to implement -`derived` methods. An example of how a `derived` method might be implemented using _only_ the low level facilities +Type class authors will most likely use higher-level derivation or generic programming libraries to implement +`derived` methods. An example of how a `derived` method might be implemented using _only_ the low-level facilities described above and Scala 3's general metaprogramming features is provided below. It is not anticipated that type class authors would normally implement a `derived` method in this way, however this walkthrough can be taken as a guide for -authors of the higher level derivation libraries that we expect typical type class authors will use (for a fully +authors of the higher-level derivation libraries that we expect typical type class authors will use (for a fully worked out example of such a library, see [Shapeless 3](https://github.com/milessabin/shapeless/tree/shapeless-3)). -## How to write a type class `derived` method using low level mechanisms +## How to write a type class `derived` method using low-level mechanisms -The low-level method we will use to implement a type class `derived` method in this example exploits three new type-level constructs in Scala 3: inline methods, inline matches, and implicit searches via `summonInline` or `summonFrom`. +The low-level technique we will use to implement a type class `derived` method in this example exploits three new type-level constructs in Scala 3: inline methods, inline matches, and implicit searches via `summonInline` or `summonFrom`. 
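The three constructs can be seen in isolation in a small sketch; the `Show` type class and `describe` method here are hypothetical, purely to illustrate how they combine:

```scala
import scala.compiletime.{erasedValue, summonInline}

// Hypothetical type class, only for demonstration
trait Show[T]:
  def show(x: T): String

given Show[Int] with
  def show(x: Int) = x.toString

// An inline method: expanded at each call site, where T is concrete
inline def describe[T](x: T): String =
  // An inline match: reduced at compile time against the concrete T
  inline erasedValue[T] match
    case _: Int => "Int " + summonInline[Show[T]].show(x) // implicit search at expansion time
    case _      => "unknown"

val d = describe(42) // "Int 42"
```

Because the match is reduced during inlining, only the selected branch is typechecked in full, and `summonInline` runs its implicit search at the expansion site rather than at the definition site.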
Given this definition of the `Eq` type class, ```scala @@ -335,13 +335,13 @@ inline def derived[T](using m: Mirror.Of[T]): Eq[T] = ``` Note that `derived` is defined as an `inline def`. -This means that the method will be inlined at all call sites (for instance the compiler generated instance definitions in the companion objects of ADTs which have a `deriving Eq` clause). +This means that the method will be inlined at all call sites (for instance the compiler-generated instance definitions in the companion objects of ADTs which have a `deriving Eq` clause). > Inlining of complex code is potentially expensive if overused (meaning slower compile times) so we should be careful to limit how many times `derived` is called for the same type. -> For example, when computing an instance for a sum type, it may be necessary to call `derived` recursively to compute an instance for a one of its child cases. +> For example, when computing an instance for a sum type, it may be necessary to call `derived` recursively to compute an instance for each one of its child cases. > That child case may in turn be a product type, that declares a field referring back to the parent sum type. > To compute the instance for this field, we should not call `derived` recursively, but instead summon from the context. -> Typically the found given instance will be the root given instance that initially called `derived`. +> Typically, the found given instance will be the root given instance that initially called `derived`. The body of `derived` (1) first materializes the `Eq` instances for all the child types of type the instance is being derived for. This is either all the branches of a sum type or all the fields of a product type. 
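For intuition about step (1), here is what the compiler-synthesized mirror exposes for a small sum type (`Opt` is a stand-in enum for illustration; the member names are those of `scala.deriving.Mirror`):

```scala
import scala.deriving.Mirror

enum Opt[+T]:
  case Sm(value: T)
  case Nn

val m = summon[Mirror.SumOf[Opt[Int]]]
// MirroredElemTypes lists the child types in definition order,
// so `ordinal` maps Sm values to 0 and Nn to 1
val ord = m.ordinal(Opt.Nn) // 1
```

It is exactly this `MirroredElemTypes` tuple that `derived` walks to materialize one `Eq` instance per child type.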
@@ -380,7 +380,7 @@ def eqSum[T](s: Mirror.SumOf[T], elems: => List[Eq[?]]): Eq[T] = (s.ordinal(y) == ordx) && check(x, y, elems(ordx)) // (4) ``` -In the product case, `eqProduct` we test the runtime values of the arguments to `eqv` for equality as products based on the `Eq` instances for the fields of the data type (5), +In the product case, `eqProduct`, we test the runtime values of the arguments to `eqv` for equality as products based on the `Eq` instances for the fields of the data type (5), ```scala import scala.deriving.Mirror @@ -396,8 +396,9 @@ Both `eqSum` and `eqProduct` have a by-name parameter `elems`, because the argum Pulling this all together we have the following complete implementation, ```scala +import scala.collection.AbstractIterable +import scala.compiletime.{erasedValue, error, summonInline} import scala.deriving.* -import scala.compiletime.{error, erasedValue, summonInline} inline def summonInstances[T, Elems <: Tuple]: List[Eq[?]] = inline erasedValue[Elems] match @@ -486,7 +487,7 @@ Alternative approaches can be taken to the way that `derived` methods can be def inlined variants using Scala 3 macros, whilst being more involved for type class authors to write than the example above, can produce code for type classes like `Eq` which eliminate all the abstraction artefacts (eg. the `Lists` of child instances in the above) and generate code which is indistinguishable from what a programmer might write by hand. -As a third example, using a higher level library such as Shapeless the type class author could define an equivalent +As a third example, using a higher-level library such as Shapeless, the type class author could define an equivalent `derived` method as, ```scala @@ -508,7 +509,7 @@ inline def derived[A](using gen: K0.Generic[A]): Eq[A] = The framework described here enables all three of these approaches without mandating any of them. 
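Whichever implementation strategy an author chooses, users invoke derivation the same way, through a `derives` clause. As a neutral illustration, here is the standard library's `CanEqual`, whose companion provides a `derived` method:

```scala
// The derives clause generates a given CanEqual[Point, Point] in Point's companion
case class Point(x: Int, y: Int) derives CanEqual

// Comparison is permitted even under -language:strictEquality
val eq = Point(1, 2) == Point(1, 2) // true
```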
For a brief discussion on how to use macros to write a type class `derived` -method please read more at [How to write a type class `derived` method using macros](./derivation-macro.md). +method, please read more at [How to write a type class `derived` method using macros](./derivation-macro.md). ## Syntax @@ -539,22 +540,22 @@ This type class derivation framework is intentionally very small and low-level. infrastructure in compiler-generated `Mirror` instances, + type members encoding properties of the mirrored types. -+ a minimal value level mechanism for working generically with terms of the mirrored types. ++ a minimal value-level mechanism for working generically with terms of the mirrored types. The `Mirror` infrastructure can be seen as an extension of the existing `Product` infrastructure for case classes: -typically `Mirror` types will be implemented by the ADTs companion object, hence the type members and the `ordinal` or +typically, `Mirror` types will be implemented by the ADTs companion object, hence the type members and the `ordinal` or `fromProduct` methods will be members of that object. The primary motivation for this design decision, and the decision to encode properties via types rather than terms was to keep the bytecode and runtime footprint of the feature small enough to make it possible to provide `Mirror` instances _unconditionally_. -Whilst `Mirrors` encode properties precisely via type members, the value level `ordinal` and `fromProduct` are +Whilst `Mirrors` encode properties precisely via type members, the value-level `ordinal` and `fromProduct` are somewhat weakly typed (because they are defined in terms of `MirroredMonoType`) just like the members of `Product`. This means that code for generic type classes has to ensure that type exploration and value selection proceed in lockstep and it has to assert this conformance in some places using casts. If generic type classes are correctly written these casts will never fail. 
-As mentioned, however, the compiler-provided mechanism is intentionally very low level and it is anticipated that -higher level type class derivation and generic programming libraries will build on this and Scala 3's other +As mentioned, however, the compiler-provided mechanism is intentionally very low-level and it is anticipated that +higher-level type class derivation and generic programming libraries will build on this and Scala 3's other metaprogramming facilities to hide these low-level details from type class authors and general users. Type class derivation in the style of both Shapeless and Magnolia are possible (a prototype of Shapeless 3, which combines aspects of both Shapeless 2 and Magnolia has been developed alongside this language feature) as is a more aggressively diff --git a/docs/_docs/reference/dropped-features/package-objects.md b/docs/_docs/reference/dropped-features/package-objects.md index 469ec6ac5134..9fe5bbd2de41 100644 --- a/docs/_docs/reference/dropped-features/package-objects.md +++ b/docs/_docs/reference/dropped-features/package-objects.md @@ -11,7 +11,7 @@ package object p { def b = ... } ``` -will be dropped. They are still available, but will be deprecated and removed afterwards. +will be dropped. They are still available, but will be deprecated and removed at some point in the future. Package objects are no longer needed since all kinds of definitions can now be written at the top-level. Example: ```scala diff --git a/docs/_docs/reference/experimental/cc.md b/docs/_docs/reference/experimental/cc.md index b1b2fa8e80cc..5bdf91f628ec 100644 --- a/docs/_docs/reference/experimental/cc.md +++ b/docs/_docs/reference/experimental/cc.md @@ -8,7 +8,8 @@ Capture checking is a research project that modifies the Scala type system to tr ```scala import language.experimental.captureChecking ``` -At present, capture checking is still highly experimental and unstable. 
+At present, capture checking is still highly experimental and unstable, and it evolves quickly. +Before trying it out, make sure you have the latest version of Scala. To get an idea of what capture checking can do, let's start with a small example: ```scala