**3. Titanium based alloys**

Titanium and titanium-based alloys have been known since the late 18th century and are widely used today as biomaterials. They were first applied as a biomaterial in the 1940s in dentistry, and their favourable mechanical properties and excellent biocompatibility subsequently made them a desirable orthopaedic alloy.

**Figure 2.** *Exeter V40 stainless steel femoral stem.*

#### *Biomaterials in Total Joint Arthroplasty DOI: http://dx.doi.org/10.5772/intechopen.107509*

Titanium has a low density, high tensile strength and is highly corrosion resistant. Titanium spontaneously forms a passive titanium oxide layer in vivo, which confers this corrosion resistance. Pure titanium is available in various grades, with the relative oxygen content determining the degree of impurity. Titanium alloy exists as a biphasic structure: as it cools from the molten state, the alpha phase adopts a hexagonal close-packed (HCP) arrangement and the beta phase a body-centred cubic (BCC) arrangement. This biphasic microstructure results in improved fatigue resistance.

Titanium is commonly alloyed with aluminium, vanadium, niobium, zirconium and tantalum. The most common titanium alloy used in orthopaedics is Ti-6Al-4V, often termed Titanium 64 owing to its 6% aluminium and 4% vanadium content. This alloy possesses a higher ultimate tensile strength than pure titanium and a modulus of elasticity closer to that of bone than stainless steel, reducing stress shielding. Additionally, newer generation titanium alloys (TiMoFe, TiMoNbZr and TiNbZrTaSiFe) demonstrate a still lower modulus of elasticity, which may reduce stress shielding further [18–20]. Another distinct advantage of titanium alloys is MRI compatibility: titanium is non-ferromagnetic and does not heat appreciably when placed in a magnetic field. These properties make titanium an ideal material for orthopaedic implants. The main disadvantages of titanium, however, are its poor abrasion resistance and notch sensitivity. Accordingly, titanium is not suitable as a bearing material and should be handled meticulously intraoperatively.
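The link between stem stiffness and stress shielding can be made concrete with a toy load-sharing calculation. This is an illustrative sketch only, not part of the chapter: the modulus values are representative literature figures, and the iso-strain (parallel, equal cross-section) composite model is a deliberate simplification of implant-bone mechanics.

```python
# Illustrative sketch: stiffer stems carry a larger share of axial load,
# leaving the surrounding bone under-stressed (stress shielding).
# Assumptions: typical literature moduli (GPa), equal strain and equal
# cross-sectional area for implant and bone.

E_GPA = {
    "cortical bone": 17,
    "Ti-6Al-4V": 110,
    "316L stainless steel": 200,
    "Co-Cr alloy": 230,
}

def implant_load_share(e_implant, e_bone=E_GPA["cortical bone"]):
    """Fraction of axial load carried by the implant when implant and
    bone deform together (iso-strain rule of mixtures)."""
    return e_implant / (e_implant + e_bone)

for name in ("Ti-6Al-4V", "316L stainless steel", "Co-Cr alloy"):
    share = implant_load_share(E_GPA[name])
    print(f"{name}: implant carries {share:.0%} of the load")
# → Ti-6Al-4V: implant carries 87% of the load
# → 316L stainless steel: implant carries 92% of the load
# → Co-Cr alloy: implant carries 93% of the load
```

Even in this crude model, the titanium stem leaves noticeably more load in the bone than stainless steel or Co-Cr, which is the intuition behind "modulus closer to that of bone reduces stress shielding".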

#### **3.1 Cobalt chrome alloy**

Cobalt-chrome (Co-Cr) alloy was introduced into total joint arthroplasty in the early 1900s as a modification of Vitallium, a common alloy in use in dentistry at the time [21]. Most Co-Cr orthopaedic implants contain cobalt (62–68%), chromium (27–30%), molybdenum (5–7%) and nickel (<2.5%). The alloy was initially used by Smith-Petersen in 1939 in mould arthroplasty and later in the Charnley femoral stem following a move from stainless steel [9, 22]. Co-Cr possesses several properties which make it highly suitable for use in arthroplasty. As with other alloys, the presence of chromium results in the formation of a passive oxide layer, providing protection against corrosion and, as a result, excellent biocompatibility. Co-Cr alloy has one of the highest moduli of elasticity among commonly used arthroplasty materials. It also possesses a high ultimate tensile strength and excellent wear resistance.

Modern techniques for implant production use powder metallurgy to reduce the carbon content and thus limit the carbide phases which negatively impact Co-Cr's mechanical characteristics. Previous techniques involved cast-wrought production, which resulted in increased carbide formation. In contrast, powder metallurgy involves sieving a fine powder of the alloy and heating it to a temperature just below its melting point before compressing the alloy components in a die of the final component shape. Compared to cast-wrought production, this method reduces grain size and carbide formation, improving the alloy's strength and corrosion resistance.
