How to configure dynamic allocation for PySpark in Jupyter

I was using Zeppelin for PySpark scripting. Now I want to start scripting in Jupyter, but I am not able to configure the following dynamic allocation settings (these are the Livy interpreter properties I had in Zeppelin):

livy.spark.dynamicAllocation.enabled true
livy.spark.dynamicAllocation.minExecutors 6
livy.spark.shuffle.service.enabled true
livy.spark.executor.cores 4
livy.spark.executor.memory 4g

How can I achieve the same in Jupyter?
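In case it helps to show what I have tried: my understanding is that if the notebook runs plain PySpark (no Livy), these settings would be passed when building the SparkSession, something like the sketch below. The property names are the standard Spark ones with the livy. prefix dropped; I am not certain this is the right way to do it in Jupyter.

from pyspark.sql import SparkSession

# Attempted equivalent of the Zeppelin/Livy settings above,
# set directly on the SparkSession builder inside the notebook.
# Dynamic allocation settings have to be in place before the
# SparkContext starts, which is why they go on the builder here.
spark = (
    SparkSession.builder
    .appName("jupyter-dynamic-allocation")  # hypothetical app name
    .config("spark.dynamicAllocation.enabled", "true")
    .config("spark.dynamicAllocation.minExecutors", "6")
    .config("spark.shuffle.service.enabled", "true")
    .config("spark.executor.cores", "4")
    .config("spark.executor.memory", "4g")
    .getOrCreate()
)

Alternatively, if Jupyter is supposed to talk to the same Livy server through sparkmagic, I would expect the session settings to go into a %%configure cell at the top of the notebook, roughly like this (again, not verified on my cluster):

%%configure -f
{
  "executorCores": 4,
  "executorMemory": "4g",
  "conf": {
    "spark.dynamicAllocation.enabled": "true",
    "spark.dynamicAllocation.minExecutors": "6",
    "spark.shuffle.service.enabled": "true"
  }
}

Is one of these the correct approach, or am I missing something else?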