Daml Connect 1.15.0 was released as stable on Wednesday, July 14th. You can install it using:
daml install 1.15.0
Impact and Migration
To use the new exception feature, compile your project with `--target=1.14`. In Java code using the bindings, rename `com.daml.ledger.javaapi.data.Record` to `com.daml.ledger.javaapi.data.DamlRecord` everywhere.

Daml Exceptions Now Stable
Daml exceptions are now stable, introduced in a new Daml Ledger API and Daml-LF version.
Background
Daml, like most smart contract or transactional languages, has all-or-nothing semantics by default. A transaction either executes atomically as a whole, or not at all. This is key for security in a multi-party context, but requires careful handling of expected business exceptions. The new try/catch exception handling feature in Daml makes this a whole lot easier without compromising safety.
Developers can now wrap entire subtransactions in a try/catch block. Should a handleable exception be encountered in the try block, the partial transaction from the start of the try block to the exception is rolled back, and the exception can be processed in the catch block. These operations are fully validated by Daml ledgers, retaining the security and determinism guarantees of Daml transactions.
Specific Changes
Daml gains four new constructs:
- `exception` to define a new exception type
- `try` to start a subtransaction that may be rolled back
- `catch` to handle an exception from a `try` block
- `throw`, in module `DA.Exception`, to throw an exception

Four general-purpose exception types are provided: `GeneralError`, `ArithmeticError`, `PreconditionFailed`, and `AssertionFailed`.

Details on their use and an example can be found on the reference documentation page and in a new chapter of the Introduction to Daml.
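As a rough illustration of how these constructs fit together (a sketch only: the `QuotaExceeded` exception and `Counter` template below are hypothetical, not from the release notes):

```daml
-- Hypothetical example of a user-defined exception and try/catch.
module Main where

import DA.Exception (throw)

exception QuotaExceeded
  with
    limit : Int
  where
    message "Quota of " <> show limit <> " exceeded"

template Counter
  with
    owner : Party
    count : Int
  where
    signatory owner

    choice Increment : ContractId Counter
      controller owner
      do
        try do
          when (count >= 10) $
            throw (QuotaExceeded with limit = 10)
          create this with count = count + 1
        catch
          (QuotaExceeded _) ->
            -- Everything inside the try block is rolled back;
            -- recreate the counter unchanged instead.
            create this
```

If the `throw` fires, the partial transaction inside the `try` block is rolled back and only the `catch` branch's effects remain, as described above.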
Impact and Migration
This is an additive change.
Background
It is important for the operators of Daml components to be able to observe and diagnose problems downstream consumers are experiencing, ranging from component outages to rejected transactions.
Work is ongoing to improve logging and monitoring of the JSON API Server and Ledger API Servers, and this Daml Connect release includes a number of changes.
Specific Changes
- Requests are now logged at INFO level. Please enable TRACE logging for the logger `com.daml.platform.apiserver.services.transaction.ApiTransactionService` to log the request data structures.
- Additional messages are now logged at INFO level.
- The format of the logging context has changed from:
  context: {a=b, x=1, foo=bar, parties=[alice, bob]}
  to:
  context: {a: "b", x: 1, foo: "bar", parties: ["alice", "bob"]}
Similar to the Daml Driver for PostgreSQL, the JSON API now exposes metrics which can be used for monitoring. To enable these metrics, there are two new CLI flags:
- `--metrics-reporter <value>`: start a metrics reporter. Must be one of "console", "csv:///PATH", "graphite://HOST[:PORT][/METRIC_PREFIX]", or "prometheus://HOST[:PORT]".
- `--metrics-reporting-interval <value>`: set the metric reporting interval (defaults to 10s).
You can use the console reporter to see the list of available metrics, and refer to Metrics.scala for brief descriptions of each.
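For example, the JSON API could be started with a Prometheus reporter as follows (a sketch: the host, ports, and reporting interval here are illustrative, not defaults from the release notes):

```shell
# Start the JSON API against a ledger on localhost:6865,
# exposing metrics for Prometheus to scrape on port 9090.
daml json-api \
  --ledger-host localhost \
  --ledger-port 6865 \
  --http-port 7575 \
  --metrics-reporter "prometheus://localhost:9090" \
  --metrics-reporting-interval 30s
```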
Impact and Migration
The majority of these changes are additive, but there are a few changes to log levels and payloads. If you rely on certain log messages or parse them, you'll need to adjust your client code accordingly.
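One of the migrations covered further on in these notes is the rename of `com.daml.ledger.javaapi.data.Record` to `DamlRecord`. A hedged sketch of what client code looks like with the new name (the field names and values are illustrative, and the Daml Java bindings must be on the classpath):

```java
import com.daml.ledger.javaapi.data.DamlRecord;
import com.daml.ledger.javaapi.data.Party;
import com.daml.ledger.javaapi.data.Text;

public class DamlRecordExample {
    public static void main(String[] args) {
        // Construct a record payload with the renamed DamlRecord type.
        // (Previously this would have used javaapi.data.Record.)
        DamlRecord payload = new DamlRecord(
            new DamlRecord.Field("owner", new Party("Alice")),
            new DamlRecord.Field("note", new Text("hello")));
        System.out.println(payload.getFields().size());
    }
}
```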
Minor Improvements
- The new `module-prefixes` field in `daml.yaml` can be used to handle module name collisions between different DALFs.
- To avoid a clash with `java.lang.Record`, `com.daml.ledger.javaapi.data.Record` has been renamed to `com.daml.ledger.javaapi.data.DamlRecord`. The old name now denotes a sub-type of the newly renamed one, so it can still be used, but it has been marked as deprecated. If you want to use `DamlRecord` objects with existing code, note that since `Record` is now a sub-type of `DamlRecord`, methods that expect a `Record` as a parameter, or that use them as part of standard Java collections, will need to be explicitly adapted to use `DamlRecord`.
- The Java codegen now emits the `DamlRecord` type wherever `Record` was used before. Java code generated by earlier versions of Daml Connect will continue to work against newer bindings, but you should expect deprecation warnings. Conversely, code generated from this version on will not work with earlier versions of the bindings out of the box.
- Start the JSON API via `daml json-api` instead of the standalone JAR.

What's Next
A lot of work is happening under the hood at the moment, improving the production-readiness and ease of operation of the entire Daml stack and reducing the complexity of building robust client applications. As a result, the coming releases will bring improvements in:
In parallel, we are completing work on a few features: