We should teach fractions by letting the whole truth hang out: fractions were developed before Euclid to deal with practical problems, and those early fractions are similar to what is taught in school today.
Then Euclid tried to come up with concepts of natural numbers, proportions, and magnitudes. The magnitudes were to cover lengths like the hypotenuse of a right triangle with legs equal to 1, or the circumference of a circle whose diameter is 1.
Euclid's approach was difficult to understand and was eventually abandoned in favor of the new mathematical symbolism developed in the 15th to 17th centuries. This resulted in the algebraic rules for fractions we have today. Those rules appear in Euler's Elements of Algebra (1765).
Then in the 19th century, mathematicians came up with new concepts of natural number, rational number, and real number to replace Euclid's concepts and to justify the rules expressed in the new symbols.
Natural numbers were defined as the sequence obtained by starting from zero and repeatedly adding one. Rules were given, including the more difficult rule of mathematical induction, which is used both to do proofs and to define addition and multiplication of natural numbers.
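The add-one definition can be sketched in code. This is a minimal illustration, not the formal construction: it models the successor operation with ordinary integers and defines addition and multiplication by recursion, mirroring how induction lets each operation be defined from the previous one.

```python
# Sketch of the Peano-style picture: every natural number is either zero
# or the successor of a natural number. Addition and multiplication are
# defined by recursion on the second argument.

def succ(n):
    """The 'add one' step that generates the natural numbers."""
    return n + 1

def add(m, n):
    # add(m, 0) = m ;  add(m, succ(k)) = succ(add(m, k))
    if n == 0:
        return m
    return succ(add(m, n - 1))

def mul(m, n):
    # mul(m, 0) = 0 ;  mul(m, succ(k)) = add(mul(m, k), m)
    if n == 0:
        return 0
    return add(mul(m, n - 1), m)
```

Multiplication is built from addition, which is built from the successor step, which is exactly the layered structure the 19th-century definition justifies.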
Rational numbers were then defined as ordered pairs of natural numbers, with rules for manipulating the pairs. A rule was also given for when two ordered pairs are equivalent.
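A small sketch of the ordered-pairs idea, using the standard equivalence rule that (a, b) and (c, d) are the same rational exactly when a·d = b·c (the function names here are illustrative, not from any particular source):

```python
# A rational number represented as an ordered pair (numerator, denominator).

def equivalent(p, q):
    # (a, b) ~ (c, d)  iff  a*d == b*c
    (a, b), (c, d) = p, q
    return a * d == b * c

def multiply(p, q):
    # (a, b) * (c, d) = (a*c, b*d)
    (a, b), (c, d) = p, q
    return (a * c, b * d)
```

Under this rule (1, 2) and (2, 4) count as the same rational number, since 1·4 = 2·2, which is the precise version of saying 1/2 = 2/4.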
Real numbers were defined as infinite decimals, which are equivalent to infinite sequences of rational numbers: for example, the sequence whose elements are the decimal truncated after each successive place. Rules were given for these as well.
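The truncation idea can be sketched directly. Assuming we are handed the decimal digits of a number, each truncation is an exact rational, and the sequence of truncations approaches the real number being defined:

```python
from fractions import Fraction

def truncations(digits, n):
    """Rational approximations of an infinite decimal 0.d1 d2 d3 ...
    obtained by truncating after each of the first n places."""
    return [Fraction(int(digits[:k]), 10 ** k) for k in range(1, n + 1)]

# 1/3 = 0.3333... ; the truncations 3/10, 33/100, 333/1000, 3333/10000
# form the sequence of rationals that represents it.
approx = truncations("3333", 4)
```

Each successive truncation is within one part in 10^k of the target, which is why the infinite sequence pins down a unique real number.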
In addition to the everyday human requirements for rational numbers, mathematical requirements were added as part of the 19th-century definition in terms of ordered pairs with procedures. These requirements make it easier to do math.
For example, we want that
(1/2) * 2 * x = x
Dividing both sides by 1/2, we get
2 * x = x / (1/2)
So we want dividing by 1/2 to be the same as multiplying by 2. So we define division by a fraction as multiplication by that fraction with numerator and denominator switched. Mathematicians have proven this rule works consistently with the other rules.
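The invert-and-multiply rule can be checked with Python's exact rational arithmetic. The particular value of x below is arbitrary, chosen only for illustration:

```python
from fractions import Fraction

half = Fraction(1, 2)
x = Fraction(3, 7)  # an arbitrary rational

# Dividing by 1/2 agrees with multiplying by 2 (the switched fraction 2/1).
assert x / half == x * 2

# More generally, dividing by p/q agrees with multiplying by q/p.
assert x / Fraction(2, 3) == x * Fraction(3, 2)
```

Because Fraction does exact arithmetic, these checks confirm the rule on these examples with no rounding involved.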
Starting from intuitive ideas, people developed the symbolic methods of the 15th to 17th centuries. Those were then codified into the object-oriented number concept of rational number in the 19th century.
This is a typical progression for humans. Something is started, it is added to, and eventually someone figures out a way to replicate it with a simple concept. In this case, prior rules could be replicated exactly. In situations outside math and hard sciences, the replication in codification is only approximate.
What we had before codification is a set of requirements on the codification. If we are lucky, all the requirements can be met. If we are not, we have to pick and choose what we will match or match closely and what we give up.
The 19th-century rational numbers didn't require us to give up any of the rules used before the modern codification as ordered pairs with procedures.
There is still some looseness in expression, such as calling 1/2 and 2/4 rational numbers rather than equivalent representatives of the same rational number. This is harmless, however, and useful for brevity.