PySpark Structured Streaming updating state

I'm looking at processing a stream of data using Structured Streaming in PySpark and I'm not sure how to achieve what I want. Possibly I'm just missing something, or possibly it simply isn't implemented in PySpark.

The data will be events a user generates over a session (each carrying a session_id to tie the events together), which may last for 30 minutes or so. As the events come in I want to build up a state for that user: how many pages they visited, how far they got down the funnel, whether they've purchased anything, and so on.

It is my understanding that in Scala you could use mapGroupsWithState to build up state, roughly along these lines (a sketch based on my reading of the docs, untested, and the Event/SessionState case classes are just made up for this question):
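
import org.apache.spark.sql.streaming.{GroupState, GroupStateTimeout}

// Illustrative shapes only, not my real schema
case class Event(sessionId: String, page: String, purchased: Boolean)
case class SessionState(sessionId: String, pagesVisited: Long, hasPurchased: Boolean)

def updateSession(
    sessionId: String,
    events: Iterator[Event],
    state: GroupState[SessionState]): SessionState = {
  // Start from the previous state for this session, or an empty one
  val previous = state.getOption.getOrElse(SessionState(sessionId, 0L, hasPurchased = false))
  val batch = events.toSeq
  val updated = previous.copy(
    pagesVisited = previous.pagesVisited + batch.size,
    hasPurchased = previous.hasPurchased || batch.exists(_.purchased))
  state.update(updated)
  state.setTimeoutDuration("30 minutes") // drop the state once the session goes quiet
  updated
}

// events is a streaming Dataset[Event]; needs import spark.implicits._
val sessions = events
  .groupByKey(_.sessionId)
  .mapGroupsWithState(GroupStateTimeout.ProcessingTimeTimeout)(updateSession)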

However, in PySpark I cannot find an equivalent; all I've managed to do is a groupBy on the incoming events:

(
    events
    .groupBy('session_id')
    .agg({ ... })
)

I've not managed to find much in the way of examples or documentation on the PySpark side, which makes me suspect it isn't possible. Any advice on how to do stateful streaming in PySpark would be appreciated.