Blocking CSS and JS with a Disallow directive in your robots.txt file is not recommended, because Google will not be able to render your pages properly.
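For illustration, here is a minimal robots.txt sketch showing the kind of Disallow rules to avoid, along with Allow rules that keep rendering resources crawlable. The folder names /js/ and /css/ are placeholders for this example, not paths from your site.

```
# Anti-pattern: hides scripts and stylesheets from Googlebot
# (the folder names /js/ and /css/ are only placeholders)
User-agent: Googlebot
Disallow: /js/
Disallow: /css/

# Safer alternative (use instead of the rules above): explicitly
# allow rendering resources so Googlebot can fetch them
User-agent: Googlebot
Allow: /*.css$
Allow: /*.js$
```

Google supports the Allow directive and the * and $ wildcards, so the second group explicitly permits stylesheet and script files sitewide.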
These days, Google is close to human in how it understands page structure. Imagine hiding the JS and CSS from a human visitor: how would the page look? Broken, right?
In extreme cases, you will receive an error message in Search Console that your pages cannot be properly rendered because you have blocked resources.
Here is a Google Search Central video in which Matt Cutts explains why you should not block Googlebot from crawling your JavaScript and CSS.