What are the performance costs of introducing a CLR type (class or interface)?
Although .NET allows dynamic invocation (e.g. via reflection or the C# dynamic keyword), when using a language such as C# we sometimes feel it is necessary to use static typing, in order to prove that our program is correct and will not have typing issues at runtime.
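To make the contrast concrete, here is a minimal sketch (hypothetical Greeter type, not from any real library): a dynamic call defers member lookup to runtime, while the statically typed call is checked by the compiler.

```csharp
using System;

class Greeter
{
    public string Greet(string name) => "Hello, " + name;
}

class Program
{
    static void Main()
    {
        // Dynamic route: this compiles no matter what member we name;
        // a typo like d.Gret(...) would only fail at runtime with a
        // RuntimeBinderException.
        dynamic d = new Greeter();
        Console.WriteLine(d.Greet("world"));

        // Static route: the compiler verifies Greet exists and takes
        // a string, so a typo is caught before the program ever runs.
        Greeter g = new Greeter();
        Console.WriteLine(g.Greet("world"));
    }
}
```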
Sometimes this results in us introducing interfaces or base classes that feel like they exist just for the purpose of explaining to the compiler that ‘Yes, I know all the objects I pass to this context are going to understand an invocation of method X with argument Y – here, I will prove it to you using an interface definition!’ (For example, .NET internally uses an IReadChunkBytes interface to allow passing either StreamReadChunkBytes or BufferReadChunkBytes objects to some method or other.)
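The actual .NET types mentioned above are internal, so as a sketch of the pattern only (the member names here are hypothetical, not the real internal API):

```csharp
using System;
using System.IO;

// The interface exists purely so the compiler can verify that both
// implementations can service the same read call.
interface IReadChunkBytes
{
    int ReadChunk(byte[] destination, int offset, int count);
}

sealed class StreamReadChunkBytes : IReadChunkBytes
{
    private readonly Stream _stream;
    public StreamReadChunkBytes(Stream stream) => _stream = stream;

    public int ReadChunk(byte[] destination, int offset, int count)
        => _stream.Read(destination, offset, count);
}

sealed class BufferReadChunkBytes : IReadChunkBytes
{
    private readonly byte[] _buffer;
    private int _position;
    public BufferReadChunkBytes(byte[] buffer) => _buffer = buffer;

    public int ReadChunk(byte[] destination, int offset, int count)
    {
        int n = Math.Min(count, _buffer.Length - _position);
        Array.Copy(_buffer, _position, destination, offset, n);
        _position += n;
        return n;
    }
}

static class Consumer
{
    // Accepts either implementation; the compiler, not a runtime
    // check, guarantees that ReadChunk exists with this signature.
    public static int Drain(IReadChunkBytes source)
    {
        var buf = new byte[16];
        int total = 0, read;
        while ((read = source.ReadChunk(buf, 0, buf.Length)) > 0)
            total += read;
        return total;
    }
}
```

The question is whether defining IReadChunkBytes, which adds a type to the assembly and a virtual dispatch to every call, is worth that compile-time proof.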
Other times we create classes or types to serve purposes which do not feel very usefully ‘type-y’, such as acting as unique identifiers (a bit like enums) with small attached behavior, or holding a set of constants, etc.
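The ‘unique identifier with small attached behavior’ case might look like the following type-safe-enum sketch (all names hypothetical):

```csharp
using System;

// A class standing in for an enum: each static instance is a unique
// identifier, and the private constructor guarantees no values exist
// beyond the declared set.
sealed class CompressionKind
{
    public static readonly CompressionKind None = new CompressionKind("none", 1.0);
    public static readonly CompressionKind Gzip = new CompressionKind("gzip", 0.4);

    private readonly string _name;
    private readonly double _typicalRatio;

    private CompressionKind(string name, double typicalRatio)
    {
        _name = name;
        _typicalRatio = typicalRatio;
    }

    // The small attached behavior a plain enum could not carry.
    public long EstimateCompressedSize(long rawBytes)
        => (long)(rawBytes * _typicalRatio);

    public override string ToString() => _name;
}
```

Here the class buys identity plus a method, but at the cost of an extra type in the assembly compared to a bare enum or a couple of constants.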
I’m interested in better understanding the compile-time, runtime, and other costs I will incur when I face such design decisions and ask ‘should I define a new type or interface just in order to solve this problem?’ Obviously there will be two sides to the cost and benefit in each such comparison, but in general we should hopefully see the same costs for ‘define new type’ in each such comparison/discussion. How do we quantify these costs?