Saturday, 16 October 2021

Lupine Publishers | Factors Impinging On Lung Function Deficits and Respiratory Symptoms among Workers at Wood-Burning Earth Kilns

 Lupine Publishers | Journal of Respiratory & Skin Diseases


Abstract

Background: Documented evidence confirms that inhalation of toxic substances emitted during charcoal production is associated with lung function deficits and respiratory symptoms; nonetheless, other factors could also give rise to similar respiratory disorders. This study was designed to ascertain the influence that other impinging factors wield on respiratory symptoms and lung function deficits among workers at wood-burning earth kilns.

Methodology: This was a cross-sectional analytic study conducted among workers exposed to wood smoke from wood-burning earth kilns in Southern Nigeria. Because the workers were few, a comprehensive sampling of all workers who willingly consented to participate in this study was done. A modified Medical Research Council questionnaire and a portable spirometer were the study instruments. Data were analyzed with SPSS version 22. Associations for categorical data were tested with the chi-square test, while Student's t test was applied to estimate the difference between means. The significance level was pre-determined as a p value less than 0.05.

Results: The modal age group was 40-49 years (28.4%); about half (48%) of the respondents were burners and less than two-fifths (38.5%) were domestic biomass users. All indices were worse among workers with a history of asthma (p < 0.05), whereas the same indices except PEFR were higher among workers with a history of moulding blocks (p < 0.05). However, for the three workers who cooked in their rooms, only the mean FVC and FEV1/FVC were significantly higher than in others. The association of duration of work with the prevalence of respiratory symptoms was not significant (p > 0.05). Wood setters had the highest prevalence of chronic cough, wheeze, breathlessness and chest tightness, but the association between job description and prevalence of symptoms was not significant (p > 0.05).

Conclusion: A history of asthma significantly and negatively impinged on lung function among these workers. Duration of the job, amongst other factors, did not influence the prevalence of respiratory symptoms. Pre-employment screening of workers for respiratory disorders may be a worthwhile venture to pursue in the long term.

Introduction

Biomass remains one of the most primeval sources of energy; however, there is uncertainty about sustaining its viability, particularly as rural-urban development trailing deforestation is becoming popular in developing countries [1]. This apparent developmental activity thrives on commercialization of forest wood reserves, as it provides substantial financial proceeds for rural residents, without an immediate evaluation of the negative environmental impact that would invariably jeopardize their continued subsistence [2,3]. This perceived threat to the availability of wood fuel has prompted exploration of renewable energy such as solar and wind. Nevertheless, in sub-Saharan Africa the technical proficiency for harnessing such inexhaustible energy supplies is grossly undeveloped, owing mainly to poverty; therefore, biomass is vastly relied upon [4]. Demand for energy sources in urban locales is step-wise: firewood is the first step, followed by charcoal and fossil fuel, with electrical energy the least demanded among the poor [2]. Firewood is more popular among rural populations, and charcoal, owing to its virtual weightlessness and smokeless characteristics, tends to be available in metropolitan zones for those who cannot sustainably afford fossil fuel or electricity [5,6].

Charcoal is the product that ensues when wood is subjected to high temperature and pressure, especially in an air-tight enclosure [7]. The technological methods for making charcoal are diverse, with varying costs, levels of required expertise and burning efficiency. Nonetheless, most charcoal production in southern Nigeria, as in some other parts of Africa, occurs in earth kilns. This device not only has the drawback of low combustion efficiency but is also fraught with giving off wood smoke [8], a substance with recognized toxicity to the respiratory and other systems of the body [9,10]. Making charcoal with devices such as retort and mekko kilns, which condense effluent gases, can substantially augment energy recovery and make the environment safer, as almost all toxic substances are consumed [7-11]. Similar technologies have been operated in Ghana and Costa Rica and might considerably advance charcoal manufacture if utilized in our setting. Nonetheless, their use has high cost implications and is implausible to be effected by the charcoal producers in Nigeria. Documented evidence confirms that inhalation of toxic substances emitted during charcoal production is associated with lung function deficits and the emergence of respiratory symptoms [12-14]; nonetheless, other factors, such as exposure to wood dust, could also give rise to similar respiratory disorders [15].

Work-related pulmonary diseases (occupational asthma, occupational rhinitis, chronic obstructive airway disease, etc.) constitute the highest cause of work-related illnesses in the United States of America [16]. The burden is enormous and exerts a huge toll on government and individuals. The annual spending on asthma in the United States of America is over $17 billion; this includes indirect costs from loss of productive work days due to disease and death [17]. In Africa, asthma has been estimated to affect over 10% of the population, and work-related asthma accounts for about a quarter of all cases of adult asthma [18,19]. A report from Nigeria valued the annual cost of follow-up for one asthmatic patient at about $368.4 [19]. Lung function abnormalities and respiratory symptoms could be accentuated in workers at earth kiln sites who already have respiratory disorders. Thus, this study was designed to ascertain the influence that other impinging factors, such as domestic biomass use, smoking, exposure to other air pollutants, indoor cooking and duration of work, could exert on the occurrence of respiratory symptoms and lung function deficits among workers at wood-burning earth kilns. Outcomes of this study could provide a scientific basis for adjudicating apposite recommendations to forestall a deterioration of the respiratory health of workers already involved in occupations with a potential risk of respiratory dysfunction.

Methodology

Study Population

This was a cross-sectional analytic study conducted among workers exposed to wood smoke from wood-burning earth kilns in Southern Nigeria. Because the workers were few, a comprehensive sampling of all workers who willingly consented to participate in this study was done.

Ethical Approval

Ethical approval was given by the Health Research Ethics Committee of Delta State University Teaching Hospital. Workers' participation in this study was absolutely voluntary, as none of them was coerced, and informed consent was given by each participant before inclusion in the study.

Procedure

A modified Medical Research Council (MRC) questionnaire was applied to obtain information from workers about their respiratory symptoms and previous respiratory disorders, job description, duration of work, previous jobs and exposure to air pollutants [20]. A hand-held lung function measuring device (Micro Medical Ltd., Kent, UK) was employed, in conformity with the American Thoracic Society and European Respiratory Society Joint Task Force guidelines on spirometry, to measure lung function indices [21]. Workers were briefed about the aim of spirometry; the spirometric manoeuvres were demonstrated to them, after which they were encouraged to perform repeated trial manoeuvres. A satisfactory performance was taken to be one with a forceful and sustained expiration lasting a minimum of 6 seconds following a deep inspiration [21]. Of three attempts, the best reading was recorded as the acceptable value.
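The acceptability rule above (expiration sustained for at least 6 seconds; best of three attempts kept) can be sketched as a small selection routine. The function name and sample values below are illustrative, not taken from the study:

```python
# Sketch of the spirometry acceptability rule described above: a manoeuvre
# counts only if forced expiration lasted at least 6 seconds, and the best
# reading among the attempts is kept.

def best_acceptable_fev1(attempts, min_exp_seconds=6.0):
    """attempts: list of (fev1_litres, expiration_seconds) tuples.

    Returns the highest FEV1 among acceptable manoeuvres, or None if no
    manoeuvre met the 6-second criterion (i.e. the test must be repeated).
    """
    acceptable = [fev1 for fev1, secs in attempts if secs >= min_exp_seconds]
    return max(acceptable) if acceptable else None

# Three trial manoeuvres for one worker (hypothetical values); the second
# attempt is rejected because expiration lasted under 6 seconds:
print(best_acceptable_fev1([(2.1, 6.5), (2.3, 5.2), (2.2, 6.1)]))  # 2.2
```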

Statistical Analysis

All collected data were collated before entry into SPSS version 22 for analysis. Categorical data were expressed as percentages and associations tested with the chi-square test, while continuous variables were displayed as means (with standard deviations) and Student's t test was applied to estimate the difference between means. The significance level was pre-determined as a p value less than 0.05.
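As a rough illustration of the two tests named above, the chi-square statistic for a 2x2 table and Student's t statistic can be computed with the standard library alone. All numbers below are hypothetical; the study's own analysis was done in SPSS version 22:

```python
# Illustrative re-creation of the two tests described above, using only
# the Python standard library. The counts and summary statistics are
# hypothetical, not the study data.
import math

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

def student_t(mean1, sd1, n1, mean2, sd2, n2):
    """Student's t statistic (pooled variance) for two independent means."""
    sp2 = ((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2)
    return (mean1 - mean2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))

# A chi-square statistic above 3.841 (df = 1) corresponds to p < 0.05:
print(round(chi_square_2x2(6, 8, 4, 10), 3))  # 0.622 -> not significant
```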

Results

Table 1: Socio-Demographic Characteristics.


The modal age group was 40-49 years (28.4%); about half (48%) of the respondents were burners and less than two-fifths (38.5%) were domestic biomass users (Table 1). The mean FEV1 of charcoal workers who cooked in the room was 2.81 ± 0.34 L, while that in those who did not cook in the room was 2.00 ± 0.82 L; the difference was significant (p = 0.039). The mean FEV1 of charcoal workers with a history of asthma was 1.67 ± 0.00 L, while that among those without a history of asthma was 2.00 ± 0.82 L; the difference was significant (p < 0.001). The mean FEV1 of charcoal workers who had worked at a block industry was 2.58 ± 0.33 L, while in those who had not it was 1.97 ± 0.83 L; the difference was significant (p = 0.004). From these results, the potential confounder that adversely affected FEV1 was a history of asthma (Table 2). The mean FVC of charcoal workers who had worked at a block industry was 2.79 ± 0.22 L, while in those who had not it was 2.37 ± 0.95 L; the difference was significant (p = 0.003). FVC was not affected adversely by any potential confounder (Table 3).

Table 2: Charcoal workers' FEV1 (L) and Potential Confounders.


Table 3: Charcoal workers' FVC and Potential Confounders.


The mean FEV1/FVC ratio of charcoal workers who cooked in the room was 89.00 ± 1.73, while that in those who did not cook in the room was 83.13 ± 12.04; the difference was significant (p = 0.03). The mean FEV1/FVC ratio of charcoal workers with a history of asthma was 69.0 ± 0.00, while that among those without a history of asthma was 83.69 ± 11.55; the difference was significant (p < 0.001). The mean FEV1/FVC of charcoal workers who had worked at a block industry was 89.00 ± 1.54, while in those who had not it was 83.08 ± 12.15; the difference was highly significant (p < 0.001). The results show that a history of asthma negatively affected FEV1/FVC (Table 4). The mean PEFR of charcoal workers with a history of asthma was 196.00 ± 0.00 L/min, while that among those without a history of asthma was 237.17 ± 98.65 L/min; the difference was significant (p < 0.001). This shows that a history of asthma reduces PEFR (Table 5).

Table 4: Charcoal workers' FEV1/FVC ratio and Potential Confounders.


Table 5: Charcoal workers' PEFR and Potential Confounders.


A higher proportion, 6 (42.9%), of charcoal workers who had worked for 5-10 years had cough compared with 4 (28.6%) of those who had worked for less than 5 years; this difference was not statistically significant (p = 0.094). The association of duration of work with the prevalence of respiratory symptoms was not significant (p > 0.05) (Table 6). Wood setters had the highest prevalence of chronic cough, wheeze, breathlessness and chest tightness, whereas the prevalence of productive cough and nasal discharge was highest among burners; the association between job description and prevalence of symptoms was not significant (Table 7). The association between all variables and the presence or absence of respiratory symptoms was not significant (Table 8).

Table 6: Respiratory Symptoms among Charcoal workers and Duration of work.


Table 7: Respiratory symptoms among charcoal workers by Job description.


Table 8: Factors associated with presence of respiratory symptoms.


Discussion

The age distribution in this study indicates that the earth kiln workers were mainly a young and middle-aged population, particularly as the modal age range, representing over a quarter, was in the middle age. However, females outnumbered males, probably because they were steadier on the job, unlike males who frequently changed jobs while seeking greener pastures [22]. Workers who cooked in their rooms had far better mean FEV1 and forced expiratory ratio than those who did not. This is an unexpected finding, as indoor cooking is associated with abnormalities in lung function [23-25]; notwithstanding, all three such workers probably cooked with fossil fuel rather than biomass. On the other hand, the period and extent of contact with wood smoke among earth kiln workers probably exceeded their exposure to impinging factors like biomass and tobacco smoke, such that it made the influence of those pollutants inconsequential. Thus, there were no significant differences in lung function indices between users of tobacco and biomass and non-users.

Conversely, from the results of this study, the factor that adversely affected lung function was a history of asthma; consequently, all pulmonary indices were worse in workers with a history of asthma. Further, only among workers with asthma was the forced expiratory ratio less than 70%, which substantiates the obstruction to air outflow seen in asthma that can be aggravated by occupational wood smoke exposure [26]. The foregoing suggests that, in the presence of asthma, exposure to pollutants from wood-burning earth kilns further worsens lung function. Therefore, it may be suggested that workers be screened for pre-existing respiratory disorders in order to exclude those with asthma and other lung abnormalities prior to being employed in charcoal production.

All lung function indices except PEFR were much better among charcoal workers who had worked at a block industry than among others. Although the reason for this observation is not distinctly obvious from this study, it could be opined that workers who previously moulded blocks were fewer and probably had a short exposure to dust from moulding blocks. In addition, coarse particles generated from construction-related activities like block moulding are less harmful than fine particles, which are capable of reaching deep into the tiniest pulmonary bronchioles and parenchyma [27]. Despite the high prevalence of respiratory symptoms observed in this study, duration of work was not strongly associated with any respiratory symptom. This may suggest that most symptoms were acute and occurred more among those with a shorter length of exposure.

In contrast, a study in Kebbi, Nigeria, in which more than a quarter of workers had been exposed for over ten years, reported that the association between duration of exposure, chest tightness and dyspnoea was significant [28]. Another study, conducted in Iran, also recorded a significant association between work duration and symptoms [29]. This major difference may not be unrelated to the higher proportion of workers with a shorter duration of exposure to wood smoke in this study. Even though the average duration of work was almost 8 years, only about one-fifth of charcoal workers in this study had been involved in charcoal production for more than 10 years, whereas more than half had worked for less than 5 years. The average duration of exposure recorded in this study is considerably lower than the 19.1 years reported from a previous study conducted in Brazil and the 14.2 years from the Iranian study [29-31].

A probable reason for the above is that large-scale commercial charcoal production is still less well established in this part of the world, and workers tend to get involved on a short-term basis as a means to get by rather than as a permanent job [22]. This study did not establish a remarkable association between the specific types of job performed by workers and the prevalence of respiratory symptoms; nonetheless, the prevalence of most respiratory symptoms was highest among wood setters. From the preceding finding, it is likely that charcoal workers had similar levels of exposure to wood smoke, and wood setters possibly had a longer duration of exposure to wood smoke.

Unlike the previous study in Iran, where significant relationships were established between harmful pollutants and respiratory symptoms [29], no association was observed between most pollutants and the presence of respiratory symptoms in this study. While this contrast is remarkable, a plausible explanation is not far-fetched, since exposure to pollutants other than wood smoke had already been discontinued for all the earth kiln workers in this study. Prospective cohort studies assessing baseline respiratory function and the impact of subsequent exposure to toxic substances discharged from burning wood in earth kilns would be needed in future to highlight lung function deficits regardless of impinging factors.

Conclusion

A history of asthma significantly and negatively impinged on lung function among these workers; contrarily, biomass exposure did not make any difference to their lung function. Unexpectedly, workers with histories of block moulding and of cooking indoors separately had better lung function, though their small number could have accounted for this difference. Duration of the job, amongst other factors, did not influence the prevalence of respiratory symptoms. Pre-employment screening of workers for respiratory disorders, to exclude those with asthma and other lung abnormalities prior to employment in charcoal production, may be a worthwhile venture to pursue in the long term.


Friday, 15 October 2021

Lupine Publishers | Lymph Node Blood Vessels: Exit Route for Systemic Dissemination of Cancer

 Lupine Publishers | Journal of Biotechnology & Microbiology


Abstract

There are reports of the existence of meningeal lymphatic vessels in human and nonhuman primates (marmoset monkeys) and of the feasibility of noninvasively imaging and mapping them in vivo with high-resolution, clinical MRI. On T2-FLAIR and T1-weighted black-blood imaging, lymphatic vessels enhance with gadobutrol, a gadolinium-based contrast agent with a high propensity to extravasate across a permeable capillary endothelial barrier, but not with gadofosveset, a blood-pool contrast agent.

Keywords: Meningeal Lymphatic Vessels; Cerebrospinal Fluid; Dura Mater; Superior Sagittal and Transverse Sinuses; Blood-Brain Barrier; Chronic Inflammatory Response; Environmental Factors; CRC Risk

Introduction

The topography of these vessels, running alongside the dural venous sinuses, recapitulates the meningeal lymphatic system of rodents. In primates, meningeal lymphatics display a typical panel of lymphatic endothelial markers by immunohistochemistry. This discovery holds promise for better understanding the normal physiology of lymphatic drainage from the central nervous system and potential aberrations in neurological diseases.

Material and Methods

How does the brain rid itself of waste products? Other organs in the body achieve this via the lymphatic system. A network of lymphatic vessels extends throughout the body in a pattern like that of blood vessels. Waste products from cells, plus bacteria, viruses and excess fluids, drain out of the body's tissues into lymphatic vessels, which transfer them to the bloodstream. Blood vessels then carry the waste products to the kidneys, which filter them out for excretion. Lymphatic vessels are also a highway for the circulation of white blood cells, which fight infections, and are therefore an important part of the immune system. Unlike other organs, the brain does not contain lymphatic vessels. So how does it remove waste?

Some of the brain's waste products enter the cerebrospinal fluid, the fluid that bathes and protects the brain, before being disposed of via the bloodstream. Recent studies in rodents have also shown the presence of lymphatic vessels inside the outer membrane surrounding the brain, the dura mater. Some reports show that the dura mater of humans and marmoset monkeys contains lymphatic vessels too. Spotting lymphatic vessels is challenging because they resemble blood vessels, which are much more numerous. Nevertheless, a way was found to visualize the lymphatic vessels in the dura mater using brain magnetic resonance imaging, and their presence in autopsy tissue could be confirmed using special staining methods.

For magnetic resonance imaging, monkeys and human volunteers received an injection of a dye-like substance called gadolinium, which travels via the bloodstream to the brain. In the dura mater, gadolinium leaks out of blood vessels and collects inside lymphatic vessels, which show up as bright white areas on brain scans. To confirm that the white areas were lymphatic vessels, the experiment was repeated using a different dye that does not leak out of blood vessels. As expected, the signals observed in the previous brain scans did not appear. By visualizing the lymphatic system, this technique makes it possible to study how the brain removes waste products and circulates white blood cells, and to examine whether this process is impaired in aging or disease.

Some reports [1] described the existence of a network of true lymphatic vessels within the mammalian dura mater that runs alongside blood vessels, notably the superior sagittal and transverse sinuses. The dural lymphatic vessels display typical immunohistochemical markers that identify lymphatic vessels elsewhere in the body. They provide an alternate conduit for drainage of immune cells and cerebrospinal fluid (CSF) from the brain, beyond previously described pathways of flow: via arachnoid granulations into the dural venous sinuses, and via the cribriform plate into the ethmoid region. Although early reports, based on injections of India ink into the cisterna magna of the rat, suggested that the dural pathway accounts for only a minority of the drainage, more recent studies, based on injections of fluorescent tracers and in vivo microscopy, indicate that the dural system may be substantially more important for drainage of macromolecules and immune cells than previously realized. Whether a similar network of dural lymphatics is present in primates remained an open question. Noninvasive visualization of the dural lymphatics is necessary for understanding their normal physiology and potential aberrations in neurological diseases. The existence of a dural lymphatic network in human and nonhuman primates (marmoset monkeys) was verified pathologically, and two magnetic resonance imaging (MRI) techniques that might enable its visualization in vivo were evaluated.

The first was the T2-weighted fluid-attenuated inversion recovery (T2-FLAIR) pulse sequence, which is the clinical standard for detecting lesions within the brain parenchyma and is highly sensitive to the presence of gadolinium-based contrast agents in the CSF. The second was black-blood imaging sequences, which are typically used for measurement of vascular wall thickness or detection of atherosclerotic plaque; they are tuned to darken the contents of blood vessels (even when they contain a gadolinium-based contrast agent), but in the process the images may highlight vessels with other contents and flow properties. For comparison, a postcontrast T1-weighted Magnetization Prepared Rapid Acquisition of Gradient Echoes (MPRAGE) MRI sequence was also acquired; it is widely implemented for structural brain imaging and depicts avid enhancement of dura mater and blood vessels, but would not be expected to discriminate lymphatic vessels. Cerebral blood vessels have a highly regulated blood-brain barrier, protecting the neuropil from many contents of the circulating blood. Under physiological conditions, the blood-brain barrier prevents gadolinium-based chelates in standard clinical use from passing into the Virchow-Robin perivascular spaces and parenchyma, so that these structures do not enhance on MRI. On the other hand, dural blood vessels lack a blood-meningeal barrier [1].

A Darwinian selection process promotes the spread of new distant tumors. Cancer metastasis, the migration of cells from a primary tumor to form distant tumors in the organism, can be triggered by a chronic leakage of DNA within tumor cells, according to a team led by Weill Cornell Medicine and Memorial Sloan Kettering Cancer Center researchers. How metastasis occurs has been one of the central mysteries of cancer biology. The findings, published in Nature, appear to have partly solved this mystery. The authors traced the complex chain of events that results from chromosomal instability, a widespread feature of cancer cells in which DNA is copied incorrectly every time these cells divide, resulting in daughter cells with unequal DNA content. Using models of breast and lung cancer, the investigators found that chromosomal instability leads to changes in cells that drive metastasis. They showed that chromosomal instability can cause a leakage of DNA from the nuclei of cancer cells, leading to a chronic inflammatory response within the cells. The cells essentially can hijack that response to enable themselves to spread to distant organs, said study lead author Dr. Samuel Bakhoum, a Holman research fellow at Weill Cornell Medicine and a senior resident in radiation oncology at Memorial Sloan Kettering Cancer Center. The discovery is principally a basic science advance but could also have long-range implications for cancer drug development. Metastasis causes 90 percent of cancer deaths, and this work opens new possibilities for therapeutically targeting it, said senior author Dr. Lewis Cantley, the Meyer Director of the Sandra and Edward Meyer Cancer Center and a professor of cancer biology at Weill Cornell Medicine. Prior studies have linked chromosomal instability to metastasis, although the reason for the link hasn't been clear.

The starting hypothesis was that chromosomal instability generates many genetically different tumor cells, and that a Darwinian selection process promotes the survival of the cells capable of spreading and forming distant tumors, Dr. Bakhoum said. When he injected chromosomally unstable tumor cells into mice, he indeed found that they were many times more likely to spread and form new tumors than tumor cells in which chromosomal instability was suppressed. That was true even though both sets of tumor cells started out genetically identical, with the same abnormal numbers of chromosomes, suggesting that chromosomal instability itself was a driver of metastasis. The researchers examined gene activity in these two sets of tumor cells. They found that those with high chromosomal instability had abnormally elevated activity stemming from more than 1,500 genes, particularly ones involved in inflammation and the response of the immune system to viral infections.

These were cancer cells cultured in a dish, not in the presence of any immune cells, Dr. Bakhoum said.

Recent studies by other laboratories offered some clues: chromosomes in unstable tumor cells can leak out of the cell nucleus where they normally reside. These mis-located chromosomes encapsulate themselves to form micronuclei in the fluid, or cytosol, in the main part of the cell outside of the main nucleus. However, micronuclei tend to rupture, releasing naked DNA into the cytosol. Cells interpret DNA in the cytosol as a sign of an infecting virus, which typically releases its DNA into the cytoplasm when it first attacks a cell. Human cells have evolved to fight this type of viral infection by sensing naked cytosolic DNA using a molecular machine called the cGAS-STING pathway. Once activated, this pathway triggers an inflammatory antiviral program. Lowering cGAS-STING levels reduced inflammation and prevented otherwise aggressive tumor cells from metastasizing when injected into mice. In an ordinary cell, an antiviral response stimulated by DNA leakage from the nucleus would soon bring about the cell's death. The researchers found, however, that tumor cells have succeeded in suppressing the lethal elements of the cGAS-STING response [2]. At the same time, they use other parts of the response to enable themselves to detach from the tumor and become mobile within the organism. They act as if they were certain kinds of immune cells, which are normally activated by infection and, in response, move very quickly to the site of infection or injury in the body. By doing so, cancer cells engage in a form of lethal immune mimicry, i.e. masking. The evidence, based on recent studies of metastatic tumor properties, is that about half of human metastases originate and expand this way. Researchers are currently investigating strategies for blocking the process.

It might not be feasible to target chromosomal instability itself, since tumor cells are inherently prone to it. Chromosomally unstable tumor cells, with their cytosolic DNA, are basically full of their own poison. Undoing their ability to suppress the normally lethal antiviral response to cytosolic DNA would, in principle, kill these aggressive cancer cells swiftly, with minimal effects on other cells. The next step is to understand better how these cells alter the normal response and how it might be restored [3]. Cancer cells often metastasize by hitching a ride on platelets.

Results

An Alternate Route for Metastatic Cells

Metastatic tumor cells are thought to reach distant organs by traveling through the blood circulation or the lymphatic system. Studies of mouse models now suggest a hybrid route for tumor cell dissemination. Brown et al. used distinct methodologies to monitor the fate of tumor cells in lymph nodes. They found that tumor cells could invade local blood vessels within a node, exit the node by entering the blood circulation, and then go on to colonize the lung. This dissemination route probably occurs in cancer patients too, an answer that could potentially change the way affected lymph nodes are treated in cancer. During metastasis, malignant cells escape the primary tumor, intravasate lymphatic vessels, and reach draining sentinel lymph nodes before they colonize distant organs via the blood circulation. Although lymph node metastasis in cancer patients correlates with poor prognosis, how tumor cells enter the bloodstream has been an open question. The authors examined this in the lymph nodes of mice by microinfusing tumor cells into afferent lymphatic vessels. The cells rapidly infiltrated the lymph node parenchyma, invaded blood vessels, and seeded lung metastases without involvement of the thoracic duct [4]. These results suggest that lymph node blood vessels can serve as an exit route for systemic dissemination of cancer cells in experimental mouse models. We are convinced that this form of tumor cell spreading also occurs in cancer patients.

Mysterious RNA Strands Avoid Cell Death

Researchers from Case Western Reserve University School of Medicine have discovered how unusually long strands of RNA help colon cancer cells avoid death, allowing unregulated growth. Unlike other RNAs, the intriguing strands do not appear to encode proteins and are termed "long non-coding RNAs" or "lincRNAs." A new study showed some lincRNAs could be targeted by drug developers to halt colon cancer. In the study, published in Scientific Reports, researchers compared lincRNA levels inside tumor cells with levels inside healthy colon cells. They found over 200 lincRNAs at significantly different levels in the tumor cells compared with normal cells. One, called lincDUSP, was overexpressed in 91 percent of the tumor samples. A few tumors had more than 15 times the normal amount of lincDUSP. The significant increase suggested this mysterious, previously uncharacterized RNA could be cancer-causing. To determine whether lincDUSP shows oncogenic activity in colon cancer, they tested the effects of depleting lincDUSP in patient-derived colon tumor cell lines. The researchers genetically modified colon cancer cells to deplete lincDUSP, and surprisingly, the cells began replicating at normal rates. They no longer had the unrestricted growth associated with colon cancer tumor cells. Small molecules that inhibit lincDUSP could have similar effects.

The above work demonstrates that not only protein-coding genes but also non-coding genes contribute to colon cancer progression, says Ahmad Khalil, Ph.D., senior author, assistant professor of genetics and genome sciences at Case Western Reserve University School of Medicine, and member of the Case Comprehensive Cancer Center. LincRNAs could be exploited as direct drug targets in this and other human diseases. Khalil’s team discovered that depleting lincDUSP restored inherent cell death mechanisms. Colon cancer cells with low levels of lincDUSP became susceptible to cellular checkpoints that keep growth in check: they immediately committed cell suicide (apoptosis) at the first sign of DNA damage. Depleting the single lincRNA also had widespread genetic effects. Khalil’s team discovered that reducing lincDUSP levels affected the expression of over 800 other genes. These results, combined with the team’s experiments showing lincDUSP interacting with DNA, add to a growing body of evidence that lincRNAs are central to gene regulation. As such, they could represent an intriguing arena for drug developers. Not much is known about the role of long non-coding RNAs in colon cancer, says Khalil; using new technologies that target RNA molecules instead of proteins adds a new dimension to cancer therapies [5].

Classical Immune Therapies Also Effective Against Cancer

Tumors that develop at the transition of the stomach to the oesophagus, so-called adenocarcinomas of the gastroesophageal transition (AEG), are still difficult to treat and the chances of recovery remain low. Researchers at the Comprehensive Cancer Center (CCC) of MedUni Vienna and AKH Vienna have now been able to show that patients with non-metastatic AEGs have a better prognosis if their tumor cells produce the signal molecule PD-L1. The study has been published in the journal OncoImmunology. In Austria, adenocarcinoma of the gastroesophageal transition is diagnosed in about 500 people every year, and it is the tumor type with by far the highest rate of increase. Sebastian Schoppmann, Department of Surgery (Head: M. Gnant) of MedUni Vienna and AKH Vienna and Head of the Gastroesophageal Tumor Unit of the CCC, said of the above study: The increase is alarmingly high. According to the study, one in 100 men in Europe will suffer from AEG by 2030. It is therefore particularly important to treat risk factors such as reflux early enough and to avoid others, such as excessive alcohol consumption and smoking. The standard AEG therapy includes surgical removal of the tumor, followed by chemotherapy and radiotherapy. Immunotherapy is also increasingly becoming part of the treatment, which is why the complex processes surrounding the immune response are of great interest to cancer researchers. The study team led by Dagmar Kollmann, Department of Surgery at MedUni Vienna and AKH Vienna, first author of the study and member of the CCC, retrospectively investigated the patterns with which cancer cells and special defence cells (tumor-infiltrating lymphocytes, TILs) express PD-L1 and PD-L2 as well as the associated receptor PD-1. They analyzed the tumor material of 168 patients and found that PD-L1 was detectable (expressed) in about 50 percent of cancer cells and TILs [6]. The PD-1 receptor was formed in about 80 percent of the cells.
They also found that the expression of PD-L1 in patients who had not been pre-treated with immunotherapy is an independent and powerful predictor of favorable disease progression, while the presence of PD-1 is associated with poorer development and an advanced stage of the disease. The study thus identified a new biomarker that helps to manage affected patients. In addition, the results suggest that therapies directed against the PD-1 receptor, i.e. classical immune therapies, are also effective in AEG. Loss of muscle mass represents a significant risk to esophageal cancer survival [7].

Environmental Factors Explain a Significant Part of the CRC Risk

Colorectal cancer (CRC) screening of the average-risk population is currently indicated according to age only. Gemma Ibanez-Sanz et al. have elaborated a model to stratify the risk of CRC by incorporating environmental data and single-nucleotide polymorphisms (SNPs). The MCC-Spain case-control study included 1336 CRC cases and 2744 controls. Subjects were interviewed on lifestyle factors and on family and medical history, and twenty-one CRC susceptibility SNPs were genotyped. The environmental risk model included alcohol consumption, obesity, physical activity, red meat and vegetable consumption, and nonsteroidal anti-inflammatory drug use; these factors contributed to CRC with an average per-factor OR of 1.36 (95% CI 1.27 to 1.45). Family history of CRC contributed an OR of 2.25 (95% CI 1.87 to 2.72), and each additional SNP contributed an OR of 1.07 (95% CI 1.04 to 1.10). The risk of subjects with more than 25 risk alleles was 82% higher (OR 1.82, 95% CI 1.11 to 2.98) than that of subjects with fewer than 19 alleles. This risk model, with an AUROC curve of 0.63 (95% CI 0.60 to 0.66), could be considered for stratifying screening and for encouraging patients to achieve a healthier lifestyle. Colorectal cancer screening by faecal occult blood testing has been demonstrated to reduce CRC incidence and mortality, as well as being a cost-effective strategy compared with no screening. Recent evidence on the balance of benefits and harms of cancer screening has led to proposals for more personalized strategies based on individual cancer risk, since the effectiveness of a screening strategy depends on the average cancer risk of the target population. Until now, the target population has been selected basically by age (>50 years old), a so-called one-size-fits-all strategy.
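The arithmetic behind the reported model can be sketched in a few lines. The function below is an illustrative, simplified reading of the study's figures: it assumes the per-factor odds ratios combine multiplicatively (the published model was fitted jointly, so this is only a back-of-the-envelope sketch, and the function name and inputs are our own):

```python
# Back-of-the-envelope sketch using the odds ratios quoted above.
# Assumption: per-factor ORs combine multiplicatively, a simplification
# of the jointly fitted published model.
OR_ENV_FACTOR = 1.36      # average OR per unhealthy environmental factor
OR_FAMILY_HISTORY = 2.25  # OR for a positive family history of CRC
OR_PER_ALLELE = 1.07      # OR per additional risk allele

def combined_or(n_env_factors: int, family_history: bool, n_risk_alleles: int) -> float:
    """Crude combined odds ratio relative to a baseline with no risk factors."""
    or_total = (OR_ENV_FACTOR ** n_env_factors) * (OR_PER_ALLELE ** n_risk_alleles)
    if family_history:
        or_total *= OR_FAMILY_HISTORY
    return or_total

# Example: three unhealthy lifestyle factors, a positive family history,
# and 25 risk alleles give a combined OR of roughly 31 under this sketch.
print(round(combined_or(3, True, 25), 2))
```

Even in this crude form, the lifestyle and family-history terms dominate the per-allele genetic term, which is the study's central point.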

This strategy implies performing unnecessary screening tests in low-risk people, leading to avoidable risks for patients and extra costs for the health care system. On the other hand, high-risk patients may receive non-invasive testing, which is a suboptimal screening technique in their case. A risk-based CRC screening that includes environmental risk factors, family history of CRC, and information derived from genetic susceptibility loci could improve not only the efficacy of the screening program but also the adherence of high-risk patients when they are properly informed of their personal risk. Several prediction models, either for CRC or advanced neoplasia, have been developed previously, all with limited discriminating ability. These studies have encompassed the traditional environmental risk factors for CRC, including age, sex, family history of CRC, smoking, alcohol, body mass index (BMI), physical activity, diet, and some drugs (nonsteroidal anti-inflammatory drugs (NSAIDs), acetylsalicylic acid (ASA), calcium, and vitamins). Furthermore, with the identification of CRC-associated common single-nucleotide polymorphisms (SNPs), a few studies have added genetic-susceptibility information together with some of the clinical risk factors. Each common low-penetrance allele is associated with a small increase in the risk of CRC, but the combined test of multiple SNPs may achieve a higher degree of risk discrimination, which could be useful to stratify the population. In the above study the researchers developed a risk stratification model that combines environmental factors with family history and genetic susceptibility, interpreting the relative contribution of these factors and the utility of the model for risk stratification and public health intervention [6].

The above study assessed the potential utility of a risk prediction model for CRC that combines modifiable risk factors with family history of CRC and a genetic risk score based on 21 susceptibility SNPs. The researchers observed that modifiable risk factors have a stronger value for risk prediction than does genetic susceptibility. Though the added value of each SNP is small, the combination of 21 SNPs adds significantly to the power of the risk model. The study was large enough to confirm that the established risk factors are associated with risk: family history of CRC, high consumption of alcohol, obesity, lack of physical activity in leisure time, high intake of red meat, low intake of vegetables, and non-use of NSAIDs/ASA. These risk factors were selected based on previous evidence reported in systematic reviews and meta-analyses. All were independent predictors of CRC in an average-risk population, except for smoking, which was significant only in the univariate analysis. A recent meta-analysis on smoking has shown that its effect on CRC is small, with summary estimates smaller than 1.25, and larger for rectal than for colon cancer. The researchers also analyzed other covariates that have been associated with CRC (diabetes mellitus, inflammatory bowel disease, and diverticulitis), but these were not associated with CRC, perhaps because of the smaller number of affected individuals. Nor was intake of vitamin D, calcium, or folic acid associated with CRC. Statins were not included in the model since there is controversy regarding these drugs and CRC risk.

The study confirms that family history of CRC is the strongest single risk factor for CRC. The researchers found a significant association between CRC and both family history and genetic factors, which highlights the importance of genetic susceptibility in CRC. Family history could also contribute to risk through shared lifestyle or environmental factors. In addition, gene-environment interactions may play a role in this type of cancer.

On average, each environmental factor increases risk by 35%, while each risk allele increases risk by only 7%. This implies that changing one modifiable risk factor towards a healthier lifestyle might offset the effect of about 4 risk alleles. Given that environmental factors explain a significant part of the CRC risk, we believe it is important to consider incorporating these clinical data to improve current screening, and also to encourage patients to achieve a healthier lifestyle [7].
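The trade-off quoted above can be checked directly: under a multiplicative model, the number of risk alleles whose combined effect matches one environmental factor is log(1.35)/log(1.07). A one-line check (the variable names are ours; 1.35 is the rounded per-factor figure quoted in the text):

```python
import math

OR_ENV = 1.35  # per environmental factor (rounded figure quoted in the text)
OR_SNP = 1.07  # per risk allele

# Solve OR_SNP ** n = OR_ENV for n: how many risk alleles one lifestyle
# factor is "worth" under a multiplicative model.
n_alleles = math.log(OR_ENV) / math.log(OR_SNP)
print(round(n_alleles, 1))  # ≈ 4.4, consistent with "about 4 risk alleles"
```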

Acknowledgement

The author gratefully acknowledges the assistance of Dr. Marta Ballova, Ing. Konrad Balla, Livuska Ballova, and Ing. Jozef Balla.

To read more about the Lupine Publishers Journal of Biotechnology, please click on the link below:
https://lupine-biotechnology-microbiology.blogspot.com/

Thursday, 14 October 2021

Lupine Publishers | Effectiveness of Textile Materials in Gynaecology and Obstetrics

 Lupine Publishers | Journal of Reproductive System


Abstract

The article reviews some significant advances in the use of textile materials in obstetric and gynaecological procedures. Some developments in suture materials are highlighted. Despite millennia of experience with wound closure biomaterials, no study or surgeon has yet identified the perfect suture for all situations. In recent years, a new class of suture material-barbed suture-has been introduced into the surgeon's armamentarium. Focus is directed on barbed suture to better understand the role of this newer material in obstetrics and gynaecology. A cellulose nonwoven modified with chitosan nanoparticles has been developed and its physical-chemical, morphological and physical-mechanical aspects characterized in order to explore its possible use in medicine as a gynaecological tampon. Tampons have been developed using viscose fibres coated with chitosan dissolved in acetic or lactic acid; both acids inhibit the growth of microorganisms and adjust the pH. Such a tampon proved better than existing ones and should therefore be beneficial for pregnant women.

Keywords: Suture material; Suture characteristics; Barbed sutures; Medical tampons; Viscose; Chitosan; Wound closure

Introduction

The relationship between wound closure biomaterials and surgery dates back as far as 4000 years, when linen was used as a suture material. The list of materials used to close wounds has included wires of gold, silver, iron, and steel; dried gut; silk; animal hair; tree bark and other plant fibers; and, more recently, a wide selection of synthetic compositions. Despite millennia of experience with wound closure biomaterials, no study or surgeon has yet identified the perfect suture for all situations. Natural polymers like cellulose, starch and chitosan find use in pharmacy and medicine owing to desirable properties such as biocompatibility and lack of toxicity and allergenic action [1,2]. Materials prepared from natural polymers mimic the extracellular matrix; they exhibit a soft, strong, yet elastic structure which provides mechanical stability to tissues and organs [3,4].

Availability and low price are the economic assets of natural polymers. Environmental protection issues, now strongly pronounced by the European Union, also speak for the use of natural polymers, which are seen as environmentally friendly. In general, natural polymers enjoy growing interest. They are primarily used in modern medical devices contributing to advanced healing procedures. Chitosan is a polymer abundant in nature, revealing beneficial biomedical properties like antibacterial activity against Escherichia coli and Staphylococcus aureus, which is primarily responsible for septic shock [5-7]. In recent years, cellulosic fibres have been used in the development of medical textile products, as the available literature proves [8,9]. Owing to their active surface area, strength and molecular structure, cellulosic fibres exhibit enormous possibilities in the design of bioactive, biocompatible, and advanced materials [10]. One such area of application could be cellulose tampons, which have biodegradable and antimicrobial properties. Such tampons protect from physiological and pathological vaginal discharge, which could otherwise increase the vaginal pH beyond the desirable range of 3.6-4.5 [11].

Developments in Suture Materials

A perfect suture would have the following properties:

    a. Adequate strength for the time and forces needed for the wounded tissue to heal

    b. Minimal tissue reactivity

    c. Comfortable handling characteristics

    d. Unfavourable for bacterial growth and easily sterilized

    e. Nonelectrolytic, noncapillary, nonallergenic, and noncarcinogenic

This review discusses the wound healing process and the biomechanical properties of currently available suture materials to better understand how to choose suture material in obstetrics and gynaecology. Inflammatory tissue reactions due to the presence of suture material will persist as long as the foreign body remains within the tissue. Determining the balance between the added strength the suture provides to the tissues while they heal versus the negative effects of inflammation is central to choosing the proper suture. Irrespective of the knot configuration and material, the weakest spot in a surgical suture is the knot and the second weakest point is the portion immediately adjacent to the knot, with reductions in tensile strength reported from 35% to 95% depending on the study and suture material used. Applying our current understanding of the wound healing process and the biomechanical properties of the variety of available suture materials, obstetricians and gynaecologists should choose suture material based on scientific principles rather than anecdote and tradition.

Stages in Wound Healing

The following stages are involved in wound healing and inflammatory responses

    A. Inflammation [12-16]

    B. Proliferation [17]

    C. Maturation and remodelling [12,13]

Suture Characteristics that Assist Surgeons

The following are the different categories of suture classification that are considered to best assist surgeons in choosing the proper suture material for their surgeries. These are:

    i. Suture size [18].

    ii. Tensile strength [19-21].

    iii. Absorbable versus nonabsorbable [22-27].

    iv. Multifilament versus monofilament [28-31]

    v. Stiffness and flexibility [32,33].

    vi. Smooth versus barbed [34-42].

    vii. Barbed suture [43].

Practical Aspects to be considered in Suture Materials

A perfect suture has adequate strength for the time and forces needed for the wounded tissue to heal; minimal tissue reactivity; comfortable handling characteristics; is unfavourable to bacterial growth and easily sterilized; and is nonelectrolytic, noncapillary, nonallergenic, and noncarcinogenic.

    a) Inflammatory tissue reactions due to the presence of suture material persist as long as the foreign body remains within the tissue. The degree of tissue reaction depends largely on the chemical nature and physical characteristics of the suture material.

    b) Suture classifications that best assist surgeons in choosing the proper suture material for surgery include suture size, tensile strength, absorbability, structure, flexibility, and surface texture.

    c) The perfect suture material should retain adequate strength throughout the healing process and disappear afterward with minimal associated inflammatory reaction.

    d) Irrespective of the knot configuration and material, the weakest spot in a surgical suture is the knot and the second weakest point is the portion immediately adjacent to the knot, with reductions in tensile strength reported from 35% to 95% depending on the study and suture material used.

    e) Bidirectional barbed sutures may offer multiple advantages: they eliminate the need for a knot, which effectively reduces wound tissue reactions; there is a more uniform distribution of wound tension across the suture line, yielding more consistent wound opposition; and the secure anchoring of barbed suture at 1 mm intervals may provide a reduction in gaps and thereby create a more "watertight" seal. On the downside, currently available barbed sutures are produced in a limited variety of materials and sizes.

Reflecting the age-old dictum, "It's always important to never say always and never," there is no one best suture or suture material for all surgical procedures. Although sutures have a long record of use, complications still occur, and surgeons must constantly review not only their technique but also the adjuvant materials they use in their craft. This review focused on absorbable suture materials for use in basic obstetric and gynaecologic procedures. It is meant to be neither comprehensive nor definitive. Rather, it is intended to introduce newer technologies and reinforce old concepts. Applying our current understanding of the wound healing process and the biomechanical properties of the variety of available suture materials, obstetricians and gynaecologists should choose suture material based on scientific principles rather than anecdote and tradition. Tissue characteristics, tensile strength, reactivity, absorption rates, and handling properties should be taken into account when selecting a wound closure suture. The currently available suture materials and their relative general characteristics have been listed [44].

With these considerations in mind, in most obstetric and gynaecologic procedures (excluding suspension procedures and oncologic procedures in which either adjuvant chemotherapy and/ or radiation therapy is anticipated), there is little role for either nonabsorbable sutures or collagen gut sutures [45]. The newer synthetic absorbable sutures consistently display both theoretical and clinically proven advantages for wound healing over their older, naturally derived cousins. The introduction of bidirectional barbed sutures has the potential to dramatically alter the wound closure landscape by both equalizing the distribution of disruptive forces across the suture line and eliminating the need for surgical knots.

Use of Barbed Sutures

Sutures and surgery have been tied together since the first operations were performed. Throughout the history of surgery, the variety of materials used to close wounds has included wires of gold, silver, iron, and steel; dried gut; silk; animal hairs; tree bark and other plant fibers; and, more recently, a wide selection of synthetic compositions. Despite the multitude of different procedures performed with a host of different wound closure biomaterials, no study or surgeon has yet identified the perfect suture for all situations. In recent years, a new class of suture material-barbed suture-has been introduced into the surgeon's armamentarium. Currently, there are 2 commercially available barbed suture products: the Quill™ SRS bidirectional barbed suture product line (Angiotech Pharmaceuticals, Inc., Vancouver, BC, Canada) and the V-Loc™ Absorbable Wound Closure Device product line (Covidien, Mansfield, MA). These synthetic sutures eschew the traditional, smooth, knot requiring characteristic of sutures in favour of barbs that serve to anchor the sutures to tissue without knots. This review focuses specifically on barbed suture to better understand the role of this newer material in obstetrics and gynaecology. Given the paucity of published data on the V-Loc sutures, the review will mostly focus on Quill bidirectional barbed sutures.

Key Considerations

In the author's opinion, there are few scientific data to support the current use of either plain or chromic gut sutures in any surgical procedure. In a recent study of porcine gastrointestinal closures, burst-strength pressures in wounds closed with barbed suture were no different from those of repairs performed with traditional knotted, smooth suture lines. When considerations for blood loss and hemostasis are added, the need for faster, more secure suture lines becomes readily apparent. To this end, barbed suture materials are an ideal solution. As with myomectomy closures, hysterotomy closures during caesarean delivery are facilitated by the use of barbed suture. The barbed sutures more easily draw the tissue edges together, and the 1-mm spacing between the barbs seems to yield better hemostasis.

Important Practical Aspects

    I. A new class of suture material-barbed suture-has been introduced; these synthetic sutures eschew the traditional, smooth, knot requiring characteristic of sutures in favour of barbs that serve to anchor the sutures to tissue without knots.

    II. The 6 categories of suture classification believed to best assist surgeons in choosing the proper suture material for their surgeries are suture size, tensile strength, absorbability, filament construction, stiffness and flexibility, and surface characteristics (smooth or barbed).

    III. A knot-secured, smooth suture creates an uneven distribution of tension across the wound. Although the closed appearance of a wound may be that of equal tension distribution, unequal tension burdens are placed on the knots. This tension gradient across the wound may subtly interfere with uniform healing and remodelling.

    IV. Although the data are limited and almost exclusively based on studies with bidirectional suture, barbed suture lines appear to be at least as strong if not stronger than traditional, knotted, smooth suture lines.

    V. To choose the best suture material for an obstetrics- gynaecology procedure, surgeons should take into account all the variables present, such as a tissue's collagen structure, blood supply, disruptive forces, and potential for infection. When these characteristics are considered, the physical characteristics of barbed sutures make these materials an attractive option.

Comparison between Smooth and Barbed Sutures

In 1956, Dr. J. H. Alcamo was granted the first patent for a unidirectional barbed suture, although the concept dates back to 1951, when the idea of using barbed sutures was presented for tendon repairs [46,47]. The first Food and Drug Administration (FDA) approval for barbed suture material was issued in 2004 to Quill Medical, Inc., for its Quill bidirectional barbed polydioxanone suture [48]. In March 2009, the FDA approved the V-Loc 180 barbed suture from Covidien. Whether bidirectional or unidirectional barbed suture is better is unknown, although there are reported complications of unidirectional barbed sutures migrating or extruding [49,50].

This problem is thought to have been due to the lack of counterbalancing forces on the suture line. Barbed sutures are available in a variety of both absorbable and nonabsorbable monofilament materials. Specifically, currently available bidirectional and unidirectional barbed suture materials include PDO, polyglyconate, poliglecaprone 25, glycomer 631, nylon, and polypropylene. Bidirectional barbed sutures are manufactured from monofilament fibers via a micromachining technique that cuts barbs into the suture around the circumference in a helical pattern. The barbs are separated from one another by a distance of 0.88 to 0.98 mm and are divided into 2 groups that face each other in opposing directions from the suture midpoint (Figure 1) [51].

Figure 1:

Lupinepublishers-openaccess-Reproductive-Sexualdisorder

Needles are swaged onto both ends of the suture length. Owing to its decreased effective diameter as a result of the process of creating barbs, barbed suture is typically rated equivalent to 1 USP suture size greater than its conventional equivalent. For example, a 2-0 barbed suture equals a 3-0 smooth suture. Unidirectional barbed sutures are similarly manufactured from monofilament fibers, but needles are swaged onto only 1 end whereas the other end maintains a welded closed loop to facilitate initial suture anchoring (Figure 2). Unlike bidirectional barbed suture, unidirectional barbed suture is rated equal in strength to its USP smooth suture counterpart. However, this strength rating difference between the 2 barbed varieties is the result of labeling differences rather than an actual material benefit.

Figure 2:

Lupinepublishers-openaccess-Reproductive-Sexualdisorder

Importance of suture knots

It is difficult for many surgeons to think about suture material without an accompanying knot. Nonetheless, the surgical knot used with a length of smooth suture is a necessary evil, accepted as the only irrefutable means to anchor suture material within a wound. A knot-secured, smooth suture inevitably creates an uneven distribution of tension across the wound. Although the closed appearance of a wound may be that of equal tension distribution, unequal tension burdens are placed on the knots rather than on the length of the suture line. This tension gradient across the wound may subtly interfere with uniform healing and remodelling. The weakest spot in any surgical suture line is the knot. The second weakest point is the portion immediately adjacent to the knot, with reductions in tensile strength reported from 35% to 95% depending on the study and suture material used [52,53].

When functional biomechanics are considered, this finding should not be surprising considering both the effects of slippage of suture material through the knot and the unavoidable suture elongation that occurs as a knot is formed and tightened. Given the excessive relative wound tension on the knot and the innate concerns for suture failure due to knot slippage, there is a predilection toward overcoming these concerns with excessively tight knots. However, surgical knots, when tied too tightly, can cause localized tissue necrosis, reduced fibroblast proliferation, and excessive tissue overlap, all of which lead to reduced strength in the healed wound [54]. A surgical knot represents the highest amount and density of foreign body material in any given suture line. The volume of a knot is directly related to the total amount of surrounding inflammatory reaction [55]. If minimizing the inflammatory reaction in a wound is important for optimized wound healing, then minimizing knot sizes or eliminating knots altogether should be beneficial as long as the tensile strength of the suture line is not compromised. Finally, with minimally invasive laparoscopic surgeries, the ability to quickly and properly tie surgical knots has presented a new challenge. In cases where knot tying is difficult, the use of knotless, barbed suture can securely re-approximate tissues with less time, cost, and aggravation [56,57].

Although the skills necessary to properly perform intra- or extracorporeal knot tying for laparoscopic surgery can be achieved with practice and patience, this task is a difficult skill that most surgeons still need to master to properly perform closed procedures. In addition, laparoscopic knot tying is more mentally and physically stressful on surgeons and, more importantly, laparoscopically tied knots are often weaker than those tied by hand or robotically [58-62].

Barbed Sutures in Practical Use

The choice and use of sutures in obstetrics and gynaecology (ob-gyn) is based more on anecdote and experience than on data. Though many of the suture materials routinely used in myomectomies, hysterectomies, and caesarean deliveries have endured the test of time, this should preclude neither the application of scientific review nor the quest for improvement. In addition to understanding the physical properties and characteristics of the variety of available sutures, surgeons need to consider the tissue and physiologic milieu into which the suture will be placed before choosing the material to use. For example, in general, the suture-holding strength of most soft tissues depends on the amount of fibrous tissue they contain. Thus, skin and fascia hold sutures well whereas brain and spinal cord tissue do not. Further along this line, healthier tissues tend to support sutures better than inflamed, edematous tissues.

To choose the best suture material for an ob-gyn procedure, surgeons should take into account all the variables present, such as a tissue's collagen structure, blood supply, disruptive forces, and potential for infection. When these characteristics are considered, the physical characteristics of barbed sutures make these materials an attractive option. The first use of barbed sutures in gynaecologic surgery was reported by Greenberg and Einarsson in 2008 [56]. Since that report, numerous print and video publications have followed. In procedures such as laparoscopic myomectomy and hysterectomy, the use of barbed sutures has become commonplace. In myomectomy, re-approximation of the myometrium after removal of myomas requires a suture material that adequately addresses the need for prolonged wound disruptive-force reduction, hemostasis, and minimal tissue reactivity.

Traditionally, this suture has been either a polyglycolic acid suture or polydioxanone. However, as noted earlier, braided sutures cause more tissue abrasion and inflammation than monofilaments, and the transition from open to closed procedures has introduced the difficulty of laparoscopic suturing. When considerations for blood loss and hemostasis are added, the need for faster, more secure suture lines becomes readily apparent. To this end, barbed suture materials are an ideal solution. Their synthetic, monofilament configurations should minimize local inflammation, and their absorption profiles and tissue pull-through strengths are well within the parameters needed for reduction of disruptive forces. Further, because barbed sutures allow for only minimal tissue recoiling, closing spaces such as myoma defects is easier, with each subsequent suture pass exposed to less tension than the previous bite. Finally, without the need for knot tying, wound closure times and blood loss are significantly reduced [63-65].

Barbed sutures are used in obstetric and gynaecologic procedures in the following areas:

    a) Hysterectomy [66-71].

    b) Sacrocolpopexy [72-74].

    c) Caesarean delivery [75-79].

Barbed suture is a relatively new but exciting addition to the variety of suture materials. As experience grows with barbed sutures, more applications for its use will likely arise [80]. Obstetric and gynaecologic surgeons who are interested in choosing the best materials for their operations should benefit from better understanding the underlying principles of wound healing and suture material biomechanics, and may discover many advantages to the use of barbed suture.

Chitosan Modified Cellulosic Nonwoven Material

Chitosan also displays activity against the fungus Candida albicans, which often causes vaginal candidiasis [81,82], as well as antiviral action, for example against the human papilloma virus (HPV), a cause of cervical carcinoma [83-85]. Thanks to its capacity for controlled slow release, chitosan is often used as a carrier of active substances. Various sterilization methods can be employed for chitosan without disturbing its structure and physical-chemical properties [86]. These beneficial properties ensure that chitosan is widely used in pharmacy and medicine as a safe, non-toxic polymer of natural origin [86,87]. Both natural and synthetic polymers such as PLA, poly(DL-lactide-co-glycolide) and PP are often modified with chitosan [88-91]. The process is intended to prepare new materials with beneficial biological, physical-chemical and mechanical properties [91-93]. The resulting composite materials open wide avenues of application and contribute to the introduction of new healing techniques. The incorporation of chitosan into the cellulose matrix yields devices with high biological activity and good mechanical strength.

These medical devices have potential use in gynaecology as materials with beneficial biological and mechanical properties. A basic cellulose material with built-in chitosan nanoparticles provides optimal and controlled diffusion of chitosan to the mucous membrane of the vagina and ovarium. Tampons holding antimicrobial chitosan particles may also find use as post-operative dressings and in the healing of diseases and infections owing to their high antimicrobial activity; alternatively, they may be employed as carriers of active substances. This is crucial because infections are among the gynaecological ailments that most commonly afflict women: more than 40 microorganisms can cause infections of the female genital tract. The proposed copolymer system also makes it possible to minimise irritation by adjusting the concentration of chitosan as the active agent. It is also hoped that the more developed surface of the biopolymer material will provide better contact with the vagina and the cervix, thus enhancing the antimicrobial effect. The use of modern biopolymer medical devices opens new avenues in medicine.

The aim of the research was to assess the impact of chitosan nanoparticles added to a cellulose matrix on biological activity and toxicity. Structural examinations, estimation of chemical purity, and tests of antibacterial and useful properties were carried out. A system was prepared to control the medical device with respect to microorganism growth and fulfilment of the quality requirements of medical devices [94,95]. This article presents preliminary studies on the effect of modifying cellulose nonwoven with chitosan nanoparticles on biological activity, toxicology and mechanical properties. The addition of nano-chitosan was expected to influence antimicrobial properties; however, it was necessary to verify the correlation between the concentration of chitosan and the activity. A very important purpose was to examine how the addition of chitosan would influence the mechanical properties and chemical purity, since it is necessary to guarantee optimum activity and the safety of human life and health simultaneously. Therefore, in accordance with the available standards and scientific literature, research methods were selected and used to assess cellulose nonwovens for their possible use in medicine as gynaecological tampons. The cellulose fibres used in the preparation of the nonwovens have good physical-mechanical and physical-chemical properties, including chemical purity.

The addition of chitosan in amounts of 0.25% and 0.5% to the nonwoven inhibited the growth of E. coli, while no such effect was observed with S. aureus. Nonwoven with a 1.4% content of chitosan completely stopped the growth of both E. coli and S. aureus. The initial material revealed antifungal activity against C. albicans at the level of 91.7%; the addition of 0.5% chitosan entirely inhibited the growth of the microorganisms, and the antifungal activity of the nonwoven with 0.25% chitosan against C. albicans was only slightly below 100%. Based on the results of physical-mechanical testing, the nonwoven with 0.5% chitosan was found to have the best mechanical and useful properties [96]. Assessment of chemical purity likewise indicated the most favourable properties for the nonwoven with 0.5% chitosan. The structure of the fibres examined after an extraction simulating normal use was not disturbed, which suggests that the hygiene material manufactured is safe to use. Summarizing the results, it may be concluded that the initial cellulose fibres are a good raw material for the preparation of innovative medical devices.

Use of Chitosan-Based Viscose Material

An increase in pH may reduce the natural defences of the vagina, creating conditions conducive to bacterial growth [11,12,97,98]. A number of techniques and formulations have been developed to manage the problem of vaginal pH rising above the normal range [13,99], but these have proved unsatisfactory from a practical point of view with regard to usage and acceptability. Therefore, the development of materials for the prevention and treatment of gynaecological infections still represents a significant challenge. Chitosan is the second most abundant polysaccharide on earth after cellulose. It is a polycationic biopolymer with a broad spectrum of medically useful activity, including antibacterial, antifungal, and haemostatic properties [100]. Chitosan possesses many desirable characteristics with regard to vaginal infections: it is effective against vaginal candidiasis and toxic shock syndrome, has been applied in the treatment of ovarian cancer, and may help support safe pregnancy and avoid premature birth.

Chitosan is highly suitable for adsorption onto cellulose fibres, thereby imparting antimicrobial activity, and the functionalization of viscose cellulose using chitosan has been investigated. The objective is to assess the potential of such a material for the development of new tampons which, apart from maintaining or creating the desired physiological pH value, would also possess antibacterial and antimycotic properties. The tampons do not exhibit negative effects, such as inflammation risks or infections from yeasts, and on repeated use help to sustain the required moisture in the vagina.

Conclusion

Textile materials have made their entry into many areas of medical textiles, of which gynaecology and obstetrics is one; textile sutures are one well-explored area. In the choice of a wound-closure suture, tissue characteristics, tensile strength, reactivity, absorption rates, and handling properties should all be considered. The wound healing process and the biomechanical properties of currently available suture materials have been reviewed here to clarify how to choose suture material in obstetrics and gynaecology. Despite the multitude of procedures performed with a host of different wound-closure biomaterials, no study or surgeon has yet identified the perfect suture for all situations. In recent years a new class of suture material, the barbed suture, has been introduced into the surgeon's armamentarium, and it has been studied to better understand its role in obstetrics and gynaecology. The impact of adding chitosan nanoparticles on the biological activity and toxicity of the prepared materials has also been examined.

A methodology was prepared for examining the gynaecological devices with respect to their useful properties, notably mechanical strength, surface density and absorption. Aqueous extracts were examined after an extraction process that simulated standard use of the medical device, and after a surplus extraction. The content of water-soluble, surfactant and reductive substances was estimated, as well as the content of heavy metals such as cadmium, lead, zinc and mercury by the ASA method. Morphological examination made it possible to assess the impact of the extraction processes on the fibre structure. Antibacterial activity against Escherichia coli and Staphylococcus aureus, and antifungal activity against Candida albicans, were measured. Altogether, these examinations assessed whether the cellulosic nonwoven modified with chitosan nanoparticles meets the demands of medical devices and lends itself to the manufacture of tampons. The suitability of chitosan-acetic acid treated tampons for gynaecological use has also been evaluated. The use of such tampons could prove beneficial for pregnant women, as in vitro trials have confirmed resistance of the tampons against Streptococcus agalactiae, which poses serious problems for pregnant women and their infants.

Read More About Lupine Publishers Journal of Reproductive System Please Click on Below Link:
https://lupine-publishers-reproductive.blogspot.com/

Tuesday, 12 October 2021

Lupine Publishers| Utilization of Direct Acting Oral Anticoagulants in Patients with Liver Cirrhosis: Is It Safe?

 Lupine Publishers| Journal of Gastroenterology and Hepatology


Abstract

Patients with liver cirrhosis are known to have an increased risk of bleeding, particularly from the gastrointestinal tract. However, recent literature has shown that patients with liver cirrhosis are also at increased risk of thrombotic complications. Therefore, it is important to consider anticoagulation in cirrhotic patients. The purpose of this article is to review the epidemiological studies available in the scientific literature comparing the risk of bleeding in cirrhotic patients utilizing DOACs vs traditional anticoagulation.

Abbreviations: INR: International Normalized Ratio; VTE: Venous Thromboembolism; DVT: Deep Venous Thrombosis; PE: Pulmonary Embolism; PTT: Partial Thromboplastin Time; LMWH: Low-Molecular-Weight Heparin; NO: Nitric Oxide; DOACs: Direct Oral Anticoagulants; AT: Antithrombin; ULN: Upper Limit of Normal; AUC: Area Under the Plasma Concentration-Time Curve; PD: Pharmacodynamics; PK: Pharmacokinetics

Introduction

The liver plays a central physiologic role in hemostasis, as it synthesizes the majority of the procoagulant and anticoagulant factors. The levels of these factors are markedly affected by the decreased liver function associated with cirrhosis, resulting in an abnormal hemostatic mechanism. The general impression in the clinical world is that liver cirrhosis is associated with decreased synthesis of procoagulant factors and hence an increased risk of bleeding. This phenomenon is known as auto-anticoagulation and is supported by the elevated international normalized ratio (INR) and low platelet count usually observed in cirrhotic patients. In this regard, gastrointestinal bleeding, and more specifically variceal bleeding, is of major concern since it contributes significantly to the mortality of patients with liver cirrhosis: mortality of at least 30% has been reported at the first episode, with a 70% recurrence rate in this patient population and a one-year survival estimate ranging from 32% to 80% [1]. However, decreased function of the cirrhotic liver also reduces the levels of anticoagulant factors, including antithrombin III and proteins C and S, which may result in an increased tendency to form clots. Interestingly, recent data also indicate rates of venous thromboembolism (VTE), including both deep venous thrombosis (DVT) and pulmonary embolism (PE), in cirrhotic patients ranging between 0.5% and 6.3% [2-10]. Dabbagh et al. [4] found that even an elevated INR > 2.2 was not protective against VTE in this patient population [4]. Gulley D et al. [10] noted that hospitalized cirrhotic patients without predisposing co-morbidities (e.g. neoplasm, congestive heart disease and chronic renal failure) had risks for VTE similar to those of non-cirrhotic patients [10]. Thus, the myth of auto-anticoagulation appears to be only partially true.
Therefore, abnormal routine blood tests (such as an elevated INR, a prolonged partial thromboplastin time [PTT], a high MELD score and a low platelet count) may indicate increased hemorrhage risk in this patient population, yet they do not imply protection against VTE, as these tests do not accurately reflect the activity of the aforementioned anticoagulant factors in the serum. As a result, the use of anticoagulation is now being increasingly encouraged in the cirrhotic patient population to avoid thrombotic complications.

Direct oral anticoagulants (DOACs) are a relatively new class of anticoagulants which selectively inhibit factor Xa (e.g. Apixaban, Rivaroxaban and Edoxaban) or factor IIa (e.g. Dabigatran) of the coagulation cascade. Oral administration, rapid onset of action, absence of heparin-induced thrombocytopenia, fewer drug interactions and no requirement for laboratory monitoring are some of the advantages that DOACs carry over traditional anticoagulant agents. DOACs may also be helpful in the management of portal vein thrombosis (PVT) and portal hypertension (pHTN) in patients with cirrhosis, as case reports have described PVT controlled by Rivaroxaban treatment [11-13]. Vilaseca M et al. [14] investigated the effect of Rivaroxaban on various mediators of portal hypertension in CCl4- and thioacetamide-cirrhotic rats. Rivaroxaban significantly decreased portal pressure in both models of cirrhosis by reducing oxidative stress, improving nitric oxide (NO) bioavailability, and ameliorating endothelial dysfunction. Rivaroxaban also markedly reduced intrahepatic microthrombosis by reducing fibrin deposition, and deactivated hepatic stellate cells, which play a major role in increasing intrahepatic vascular resistance by promoting fibrogenesis [14]. The purpose of this article is to review the epidemiological studies available in the scientific literature comparing the risk of bleeding in cirrhotic patients utilizing DOACs vs traditional anticoagulation.

Methods

An electronic Medline search was conducted using the key terms anticoagulation, oral anticoagulant, direct acting oral anticoagulant, novel oral anticoagulant, direct thrombin inhibitors, direct factor Xa inhibitors, Apixaban, Rivaroxaban, Dabigatran, Edoxaban, liver cirrhosis, chronic liver disease, and decompensated liver disease. Studies written in English and published from January 2000 to March 2018 were considered for this review article. All search results were reviewed.

Results

Hum J et al. [15] conducted a retrospective cohort study comparing the efficacy and safety of direct oral anticoagulants vs traditional anticoagulants in 45 patients with cirrhosis who were prescribed therapeutic anticoagulation over a 3-year period for thrombosis or for stroke prevention in atrial fibrillation. Twenty-seven patients were prescribed one of the DOACs and 18 were prescribed a vitamin K antagonist or low-molecular-weight heparin (LMWH). Total bleeding events were similar in the two groups (8 with DOACs vs 10 with the other agents, P = 0.12), but the DOAC group had significantly fewer major bleeding events (1 [4%] vs 5 [28%], P = 0.03) [15]. In another retrospective study, Intagliata N et al. [16] used a research database to compare the rates of bleeding in cirrhotic patients treated with DOACs (Rivaroxaban and Apixaban) with those in cirrhotic patients treated with traditional anticoagulation (coumadin and LMWH). The sample consisted of 39 patients who received anticoagulation therapy over a 3-year period: 20 patients received DOACs and 19 received traditional anticoagulation. No significant difference in bleeding was observed between the two groups (three events in the traditional anticoagulation group and four in the DOAC group, p = 0.9). Three major bleeding events were noted, two in the traditional anticoagulation group and one in the DOAC group [16]. Nagaoki Y et al. [17] also conducted a retrospective cohort study in fifty cirrhotic patients comparing the efficacy and safety of Edoxaban and warfarin for the treatment of portal vein thrombosis (PVT). After two weeks of treatment with danaparoid sodium, patients were switched to either Edoxaban (n = 20) or warfarin (n = 30), and the efficacy and safety of the two agents were compared for up to 6 months. Clinically significant gastrointestinal bleeding was encountered in 3 of 20 (15%) patients in the Edoxaban group and 2 of 30 (7%) in the warfarin group, but the difference was not statistically significant (P = 0.335) [17]. In a relatively larger retrospective cohort study, Goriacko P et al. [18] compared the rate of bleeding in chronic liver disease patients with atrial fibrillation treated with oral anticoagulants (coumadin vs DOACs). No significant difference in all-cause bleeding (HR 0.9, 95% CI 0.4-1.8) or in major bleeding was observed between the two groups [18].
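Comparisons of small 2x2 outcome tables like those above are typically evaluated with Fisher's exact test. As an illustrative sketch (not code from any of the cited studies), the major-bleeding comparison reported by Hum J et al. (1 of 27 DOAC patients vs 5 of 18 patients on traditional anticoagulation) can be checked with a self-contained two-sided Fisher's exact test:

```python
from math import comb

def fisher_exact_two_sided(table):
    """Two-sided Fisher's exact test for a 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of all tables sharing the
    observed margins whose probability does not exceed that of the
    observed table.
    """
    (a, b), (c, d) = table
    row1 = a + b          # size of group 1
    n = a + b + c + d     # total patients
    events = a + c        # total events across both groups

    def prob(k):
        # P(exactly k events fall in group 1, given fixed margins)
        return comb(events, k) * comb(n - events, row1 - k) / comb(n, row1)

    p_obs = prob(a)
    lo = max(0, row1 + events - n)
    hi = min(events, row1)
    # small tolerance guards against floating-point ties
    return sum(prob(k) for k in range(lo, hi + 1)
               if prob(k) <= p_obs * (1 + 1e-9))

# Major bleeding: 1/27 with DOACs vs 5/18 with traditional anticoagulation
p = fisher_exact_two_sided([[1, 26], [5, 13]])
print(round(p, 3))  # ≈ 0.031, consistent with the reported P = 0.03
```

This reproduces the order of magnitude of the published p-value; the original authors' exact choice of test (and any corrections) is not stated here, so small differences are possible.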

Conclusion

The data on the safety of DOACs in patients with liver cirrhosis are still at a very early stage. Based on our Medline literature search, we were able to find four studies comparing the risk of bleeding in cirrhotic patients utilizing DOACs vs traditional anticoagulation. All studies reported either fewer bleeding events in patients with liver disease treated with DOACs compared with patients treated with traditional anticoagulation, or no significant difference in bleeding risk. However, these studies were limited by their retrospective nature, small sample sizes, and lack of randomization. Owing to their retrospective nature, underreporting of bleeding events may have led to underestimation of the risk of hemorrhage. The lack of randomization may have resulted in underutilization of DOACs in cirrhotic patients at higher risk of bleeding, such as those with a high INR, a low platelet count or esophageal varices, which may also have confounded the results of these studies.

One of the major concerns regarding the use of DOACs in the cirrhotic patient population is the lack of specific antidotes in the face of life-threatening gastrointestinal bleeding or an urgent invasive procedure; since DOACs have relatively long half-lives, drug discontinuation alone is insufficient in these circumstances. Recently, three agents, Idarucizumab, andexanet alfa, and ciraparantag, have been introduced with promising antidotal effect against the DOACs [19]. Idarucizumab is the only agent approved for use to date and is specific to Dabigatran [19]. Andexanet alfa is specific to factor Xa inhibitors and is still under investigation [19]. Ciraparantag is a universal antidote and is in earlier stages of development [19]. In a study of healthy volunteers, prothrombin complex concentrate was demonstrated to reverse the anticoagulant effect of Rivaroxaban and Dabigatran [20]. In another study, prothrombin concentrates and recombinant factor VIIa were added in vitro to plasma from healthy volunteers receiving Rivaroxaban and Dabigatran, with (partial) reversal of these agents [21].

Another concern regarding the use of DOACs in patients with liver disease is that abnormal liver function may affect the pharmacodynamics (PD) and pharmacokinetics (PK) of DOACs, resulting in altered half-lives and serum concentrations of these agents in this patient population. As a result, caution and dose adjustment may be required when using DOACs in patients with abnormal liver function. Graff J et al. [22] observed that in patients with moderately impaired liver function (i.e. Child-Pugh class B), the area under the plasma concentration-time curve (AUC) of Rivaroxaban after a single 10 mg dose increased 2.27-fold, along with an increase in factor Xa inhibition [22]. Since Rivaroxaban is excreted mainly by the kidneys (66%) and the liver (34%), caution and dose adjustment of this agent are recommended in patients with cirrhosis, with or without concomitant renal failure [22]. Rivaroxaban is also contraindicated in patients with liver cirrhosis associated with coagulopathy or increased bleeding risk, and in patients classified as Child-Pugh B or C [22]. In contrast, the AUC of Dabigatran after a single 150 mg dose decreased by 5.6% in patients with moderately impaired liver function (i.e. Child-Pugh class B) [22]. Dabigatran is mainly (80%) eliminated via the kidneys and is likely the safer choice in patients with liver cirrhosis [23]. Stangier J et al. [23] observed slower conversion of the Dabigatran intermediate to active Dabigatran [24]; however, total drug exposure was comparable between healthy volunteers (n = 12) and patients with hepatic impairment (Child-Pugh class B, n = 12) [24]. Moreover, the coagulation parameters, including activated partial thromboplastin time, clotting time, and thrombin time relationships, were essentially similar in both groups [24]. Therefore, Dabigatran can be used in patients with moderate hepatic impairment without the need for dose adjustment [24].
Dabigatran should be avoided in patients with elevated hepatic enzymes (>2× ULN) and is contraindicated in patients with hepatic impairment expected to have any impact on survival [22]. The AUC of Apixaban increased 1.09-fold after single-dose administration of 5 mg, whereas the AUC of Edoxaban decreased by 5.6% after single-dose administration of 15 mg [22]. In patients with mild (Child-Pugh A) or moderate (Child-Pugh B) hepatic dysfunction, or with transaminase levels >2× the upper limit of normal (ULN), Apixaban can be used with caution. Apixaban should be avoided in patients with severe hepatic impairment and in those with hepatic impairment accompanied by increased bleeding risk [22].

It would be helpful to monitor the activity of DOACs in cirrhotic patients, particularly in those at increased risk of bleeding, for example patients with esophageal varices, an elevated INR, or a low platelet count. Novel coagulation assays need to be developed to monitor the activity of DOACs in serum from patients with liver disease. In one study, Potze W et al. [25] noticed a substantial reduction in anti-Xa levels when antithrombin (AT)-dependent anticoagulant drugs (unfractionated heparin, LMWH, and fondaparinux) were added to plasma from patients with cirrhosis as compared with plasma from healthy controls. They therefore concluded that the anti-Xa assay cannot be used to monitor AT-dependent anticoagulant drugs in patients with cirrhosis, as it may underestimate drug levels and increase the risk of bleeding. However, this was not the case with Rivaroxaban and Dabigatran, and they recommended that direct factor Xa and IIa inhibitors may be monitored through the respective anti-Xa and anti-IIa assays in patients with cirrhosis.

Read More About Lupine Publishers Journal of Gastroenterology and Hepatology Please Click on Below Link: https://currenttrendsingastroenterology.blogspot.com