All data types should be defined in terms of GADTs, with the compiler left to figure out how to compile them.
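For a concrete picture of what "defined in terms of GADTs" means, Idris's own `Vect` is already a GADT-style (indexed) declaration, where the length lives in the type:

```idris
-- From Data.Vect: the length index is part of the type,
-- so the compiler knows the size of every Vect statically.
data Vect : Nat -> Type -> Type where
  Nil  : Vect Z a
  (::) : a -> Vect n a -> Vect (S n) a
```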
Idris currently defines these constants:
```idris
data Constant
    = I Int
    | BI Integer
    | Str String
    | Ch Char
    | Db Double
    | WorldVal
    | IntType
    | IntegerType
    | StringType
    | CharType
    | DoubleType
    | WorldType
```
Someday, you'll be able to get rid of all the constants with compile-time-known sizes, and instead encode them as data types, leaving the compiler to figure it out:
```idris
Integer : Type
Integer = Vect 64 Bool

Double : Type
Double = Vect 64 Bool

Char : Type
Char = Vect 8 Bool
```
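To make the idea less abstract, here's a sketch of what arithmetic on such a representation could look like: addition defined structurally as a ripple-carry over the bits, which is exactly the kind of shape a superoptimizing backend could recognize and collapse into a single machine add. (The function names are hypothetical; nothing like this exists in Idris today.)

```idris
import Data.Vect

-- Hypothetical sketch: bitwise addition over Vect n Bool,
-- least-significant bit first. A clever backend could spot
-- this pattern and emit one hardware add instruction.
fullAdd : Bool -> Bool -> Bool -> (Bool, Bool)  -- (sum, carry out)
fullAdd a b c = ((a /= b) /= c, (a && b) || (c && (a /= b)))

addBits : Vect n Bool -> Vect n Bool -> Vect n Bool
addBits = go False
  where
    go : Bool -> Vect m Bool -> Vect m Bool -> Vect m Bool
    go _ [] [] = []
    go c (x :: xs) (y :: ys) =
      let (s, c') = fullAdd x y c
      in  s :: go c' xs ys
```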
Primitives will, at worst, look like this:
```idris
data Constant
    = Str String
    | WorldVal
    | StringType
    | WorldType
```
It looks crazy, but it's what proof assistants like Coq and Agda do. Given the SAT solvers and superoptimizers of today, this is the only sane way to represent size-known constants. Primitives are getting primitive.
But this would increase compile times!
A superoptimizing backend would already be doing this, breaking down primitives into their true representation, and pushing functions one by one through a solver. This would, if anything, lessen compile times.
But what if we don't want superoptimization?
For sure, one day superoptimization will be done so well that it will be the norm, and no one will want to compile without it. Superoptimizers will be able to cache optimizations on disk, blazing through huge codebases hot-knife-through-butter style.
But what about taking advantage of special CPU instructions?
The only sane way to deal with special CPU instructions is to just let the compiler handle it, and the easiest way to do that is to have a literal representation of the type in code.
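As an example of what "let the compiler handle it" could mean: a population count written structurally over the bit-vector representation has an obvious shape that a backend could lower to a single `POPCNT` instruction. (This is a hypothetical sketch, not anything the Idris compiler does today.)

```idris
import Data.Vect

-- Hypothetical sketch: counting set bits one at a time.
-- A backend that recognizes this recursion scheme could
-- emit a single popcount instruction instead.
popCount : Vect n Bool -> Nat
popCount []            = 0
popCount (True  :: bs) = S (popCount bs)
popCount (False :: bs) = popCount bs
```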
Idris is a beautiful language, and it's the closest thing we have to a safe, fun, and well-engineered language. It's the only general-purpose functional language I care to keep an eye on, as dependent types + linear types are an absolute must. The future is full of formal verification and superoptimization, and to facilitate that future, primitive types may no longer be the way to go.
It is ridiculous to think that, even with superoptimizers, Idris could compete with C for general-purpose software. Idris inherently deals with garbage collection, and I've accepted at this point that functional languages simply don't work without it. Nonetheless, I've been impressed by MirageOS, MLton, Souper, and many others, and it all makes me feel very giddy inside.
Idris will never surpass C anywhere in the software stack. No matter what embedded DSLs we come up with, there's just no replacing the huge amount of software already written. C is safe, performant, and minimal.
But C has always been look-but-don't-touch for me. It works, but I couldn't begin to imagine how to build and maintain software in it. Maybe once I get a job in technology, I'll be forced to deal with such code, and I'll pick it up.
But for now, I can read Idris. And I can write it too. And it's way more fun than C.