We present an approach to model the deployment costs, including compute and IO costs, of microservice-based applications deployed to a public cloud. Our model, dubbed CostHat, supports both microservices deployed on traditional IaaS or PaaS clouds and services that make use of novel cloud programming paradigms, such as AWS Lambda. CostHat is based on a network model and allows for what-if and cost sensitivity analyses. Furthermore, we have used this model to implement tooling that warns cloud developers directly in the Integrated Development Environment (IDE) about certain classes of potentially costly code changes. We illustrate our work with a case study and evaluate the CostHat model using a standalone Python implementation. We show that, once instantiated, cost calculation in CostHat is computationally inexpensive on standard hardware (below 1 ms even for applications consisting of a thousand services and endpoints). This enables its use in real time in developer tooling that continually re-evaluates the costs of an application in the background while the developer works on the code.
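To give a rough sense of the kind of network-based cost calculation referred to above, the following is a minimal Python sketch, not the model defined in the paper itself. It assumes that each endpoint has a per-invocation compute cost and IO cost plus a list of downstream calls with an average number of calls per invocation, and that the call graph is acyclic; all names and figures are hypothetical.

```python
"""Illustrative sketch of a network-based, per-endpoint cost calculation.

Assumptions (not from the paper): each endpoint carries a per-invocation
compute cost and IO cost, plus (callee, avg calls per invocation) edges,
and the call graph is acyclic.
"""
from dataclasses import dataclass
from functools import lru_cache


@dataclass(frozen=True)
class Endpoint:
    name: str
    compute_cost: float   # cost per invocation (e.g., USD)
    io_cost: float        # IO / data-transfer cost per invocation
    calls: tuple = ()     # (callee name, avg calls per invocation)


def total_cost(endpoints: dict, external_rates: dict) -> float:
    """Total workload cost given external invocation rates per endpoint."""

    @lru_cache(maxsize=None)
    def cost_per_invocation(name: str) -> float:
        ep = endpoints[name]
        own = ep.compute_cost + ep.io_cost
        # Propagate costs along the (assumed acyclic) call graph.
        downstream = sum(n * cost_per_invocation(callee) for callee, n in ep.calls)
        return own + downstream

    return sum(rate * cost_per_invocation(name) for name, rate in external_rates.items())


# Usage: a front-end endpoint calling two back-end services.
endpoints = {
    "frontend": Endpoint("frontend", 0.00002, 0.00001, (("catalog", 1.0), ("orders", 0.3))),
    "catalog":  Endpoint("catalog",  0.00001, 0.00002),
    "orders":   Endpoint("orders",   0.00003, 0.00001),
}
print(total_cost(endpoints, {"frontend": 1_000_000}))  # cost of 1M external requests
```

In a sketch like this, recalculating the total cost after a parameter change is cheap, which is consistent with the sub-millisecond evaluation times reported above and with continual re-evaluation in the background of an IDE.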