I think 2.0 should be even more memory intensive, with a minimum processing threshold that forces a large memory footprint per hashing process. By raising the minimum compute power required to hash, we force botnets out of business by making it very noticeable to owners that their computers are being used. Also, users of cloud instances will have to rent more expensive machines, which may not give them a positive ROI.
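To make the intent concrete, here is a minimal sketch of what a memory-hard construction looks like in general. This is a toy in Python, not the actual 2.0 algorithm; the scratchpad size, iteration count, and function names are all assumptions chosen for illustration:

```python
import hashlib

# Toy memory-hard hash (illustrative only, not the real algorithm):
# each hash forces the miner to allocate and randomly access a large
# scratchpad, so hashing is hard to hide on an infected machine and
# expensive to run on small cloud instances.

SCRATCHPAD_BYTES = 64 * 1024 * 1024  # hypothetical minimum footprint (64 MiB)
BLOCK = 64                           # SHA-512 digest size used to fill the pad
ITERATIONS = 1 << 16                 # data-dependent read passes

def memory_hard_hash(data: bytes) -> bytes:
    # 1) Sequentially fill the scratchpad from the input seed.
    pad = bytearray(SCRATCHPAD_BYTES)
    state = hashlib.sha512(data).digest()
    for offset in range(0, SCRATCHPAD_BYTES, BLOCK):
        pad[offset:offset + BLOCK] = state
        state = hashlib.sha512(state).digest()

    # 2) Do many reads at positions derived from the running state, so the
    #    whole pad has to stay resident; a device with less memory pays a
    #    heavy recomputation or latency penalty.
    n_blocks = SCRATCHPAD_BYTES // BLOCK
    for _ in range(ITERATIONS):
        idx = int.from_bytes(state[:8], "little") % n_blocks
        chunk = bytes(pad[idx * BLOCK:(idx + 1) * BLOCK])
        state = hashlib.sha512(state + chunk).digest()

    return state

if __name__ == "__main__":
    print(memory_hard_hash(b"block header + nonce").hex())
```

The point of the sketch is just that the footprint knob (scratchpad size) is a tunable parameter, so "raising the minimum" is a design decision, not a new invention.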
With that in mind, let us always remember that there is no such thing as an ASIC-resistant algo: as long as the market has enough fiat supporting it and a growing demand for more specialized machinery, some entity will venture to build one. So a more balanced approach should include a distribution plan that evens the curve in a manner beneficial to the community.