Well, it certainly depends on how the website is constructed, what you intend to do with the copy, and how you expect it to work afterwards.
I'm going to speak generally and assume that you can use Google to find the right tools for your own OS.
To get a copy of a rendered website you need a web content scraper. Sites that sit behind a cookie-based login token make this harder, but it can still be done. A scraper will get you the HTML and CSS, hopefully fetch the linked scripts as well, and download and place any linked images. However, if the page was generated server-side by something like PHP, all you can capture is the rendered output the server sent you; there is no way to replicate the server-side functionality itself.
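To make the asset-fetching step concrete, here is a minimal sketch in Python of what a scraper does after downloading a page: it parses the HTML and collects the URLs of linked images, stylesheets, and scripts so they can be downloaded in turn. This uses only the standard library's `html.parser`, and the sample HTML is made up for illustration; a real tool like `wget --mirror` or HTTrack handles the whole pipeline, including rewriting links.

```python
from html.parser import HTMLParser

class AssetCollector(HTMLParser):
    """Collect URLs of linked assets: images, stylesheets, and scripts."""
    def __init__(self):
        super().__init__()
        self.assets = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "src" in attrs:
            self.assets.append(attrs["src"])
        elif tag == "script" and "src" in attrs:
            self.assets.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet" and "href" in attrs:
            self.assets.append(attrs["href"])

# Hypothetical page content; in practice this would come from an HTTP fetch.
sample = """<html><head>
<link rel="stylesheet" href="style.css">
<script src="app.js"></script>
</head><body><img src="logo.png"></body></html>"""

collector = AssetCollector()
collector.feed(sample)
print(collector.assets)  # → ['style.css', 'app.js', 'logo.png']
```

Each collected URL would then be fetched and saved relative to the page, which is exactly the part that works for static content but can never recover the PHP (or other server-side) code that produced the page.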