Metrics are valuable for a software company to assess the quality of its software in several respects. However, selecting suitable metrics to gain the desired insight is not trivial. Although many frameworks for selecting metrics exist, the selection process is usually carried out by humans and is therefore prone to error and subjectivity. I compare several of these existing frameworks and choose one of them as the guideline for my concept of a metrics selection tool that guides its users while minimizing human error and subjectivity during the metrics selection process. The tool also accommodates input from multiple users regarding the metrics selection for a project and offers a way to resolve conflicting choices. Furthermore, the proposed tool can interpret metrics and perform a simulation with respect to the selected framework. To prove the concept, I present an implementation of the tool. Finally, I evaluate its performance for both the metrics selection process and the simulation.