Department of Informatics and Enabling Technologies


The Department of Informatics and Enabling Technologies applies computing technology to solve real-world problems.

Current interests include: Visualisation, Computer Graphics, Image Processing, Data modelling and management, Simulation and Modelling, and Computing and Education.


Recent Submissions

  • Publication (Open Access)
    Using accounting information systems to benefit micro businesses : A thesis submitted in partial fulfilment of the requirements for the Degree of Doctor of Philosophy at Lincoln University
    (Lincoln University, 2024) Benbow, Pamela
    Ninety percent of all businesses in New Zealand are micro businesses, defined as having zero to five employees. This sector is critical to New Zealand’s economy. Micro businesses create opportunities for new entrepreneurial talent, provide employment and offer consumers choice and variety, including specialist goods and services. Central to all businesses is the need for information, managed by the accounting information system (AIS). The AIS supports decision-making, achieving business objectives and managing limited resources. Prior studies and government reports call for further research into micro businesses so that this sector of the economy can be strengthened. This research addresses this call by exploring the benefits of using AIS in micro businesses through multiple methods, including desk-based research, semi-structured interviews with professional accountants, a survey of micro businesses and, finally, semi-structured interviews with micro business owners. Findings show that a variety of tools are used, ranging from manual record keeping to spreadsheets to computerised AIS, including mixtures of these tools. The majority of micro businesses use computerised AIS tools, of which two software providers dominate. Some accounting firms specialise their practice either by industry or by choice of AIS. Other accountants accommodate any AIS approach, focusing on the individual micro business's needs. AIS use by micro businesses is primarily focused on monitoring cash flow, sales and income activities and compliance reporting (GST and income tax). The greatest utilisation of computerised AIS and add-on tools is observed with these activities. Micro businesses could make greater use of other features, especially reporting, as a basis for decision-making.
The decision to adopt computerised AIS includes factors affecting the individual business owner (generation, individual knowledge and skill, and personal attitude to technology), internal business factors (financial costs, time costs, and the business purpose and future) and external business factors (supply chain, regulatory bodies and supporting services). The benefits of using computerised AIS include connectivity, autofill, automated calculations and drilldown. Connectivity through cloud technology provides access to a single version of the data for all users regardless of location. Autofill populates data entry screens with information previously captured, reducing the need for typing. Automated calculations complete basic arithmetic in the creation of invoices, supplier bills and reports. Finally, drilldown enables direct access to the supporting detail for information provided on screen. These benefits may not be available in older versions of computerised AIS, or in versions that include only a subset of the features. This research increases the understanding of the factors impacting micro businesses in their decision to implement computerised AIS, and the benefits of doing so. The findings support accountants, government agencies and AIS software developers in devising strategies to support micro businesses. Findings from this research are applicable to micro businesses throughout New Zealand and beyond, and will also benefit small businesses outside the micro definition, both locally and globally.
  • Publication (Open Access)
    Body composition estimation in breeding ewes using live weight and body parameters utilising image analysis : A thesis submitted in partial fulfilment of the requirements for the Degree of Doctor of Philosophy at Lincoln University
    (Lincoln University, 2023) Shalaldeh, Ahmad
    Farmers are continually looking for new reliable, objective and non-invasive methods for estimating ewe body condition. Live weight (LW) in combination with body condition score (BCS) is used by farmers as a basis to determine the condition of the animal. While LW is a crucial indicator of body composition, body condition can be evaluated by determining the amount of fat in the animal. This amount plays a key role in a ewe’s health and productivity. The body condition score is used to monitor animals to ensure the best condition and is a measure between 1 (low condition) and 5 (high condition). If a ewe has a condition below 2 this is considered poor, whereas above 3 is regarded as good condition and ready for breeding. The current method is subjective (it relies on professional judgment from farm handlers) and as such can introduce an element of error when estimating fat, which makes it difficult to monitor animal condition. A quick, objective and accurate method of body composition estimation is required to improve farm management. If such a method could be devised, many farmers around the world would use it to assist in managing sheep farms. In addition, image processing and body measurements have not previously been used to estimate body composition for ewes during the production cycle using a comprehensive, repeatable, non-invasive method. Image processing and body parameter measurements have been widely used to estimate ewe body size and weight. The objective of this thesis was to establish a relationship between the body parameters of body length, width, depth and height as independent variables and body fat, lean, bone and carcass (total weight of body fat, lean and bone) as dependent variables. The aim was to use these easily obtained body parameters to predict body composition.
Two full experiments, at weaning and pre-mating, were conducted to establish the relationship between body composition and body parameters using measurements automatically determined by an image processing application at the Lincoln University sheep farm for 88 Coopworth ewes. Computerised Tomography (CT) technology was used as a benchmark to validate the predicted body composition. A trial run, wool test, uncertainty test, repeat test and carcass test were also conducted to minimise uncertainty and test the experimental setup. The image processing application used techniques from the OpenCV library such as image extraction, conversion to HSV colour, erosion, dilation and smoothing filters to remove the ewe’s head and legs (for the side image) and extract the body to calculate the body parameters automatically. Multivariate linear regression (MLR), artificial neural networks (ANNs) and regression tree (RT) statistical analysis methods were used to analyse the relationship between the independent and dependent variables to predict body fat, lean, bone and carcass. The artificial neural network method was found to be the best method for showing how much variance in the dependent variables is explained by the set of independent variables. The results showed a correlation between the fat, lean, bone and carcass weight determined by CT and the fat, lean, bone, carcass weight and percentage of fat–carcass weight estimated from live weight and the body parameters calculated automatically by the image processing application, with R² values of 0.88, 0.85, 0.72, 0.97 and 0.94, respectively, for the training data of 138 ewes, with a root mean square error (RMSE) of less than 2.5. A new set of test data was used to test the accuracy of the results of the multivariate linear regression, neural networks and regression tree.
The neural network model provided the highest R² for total fat prediction, with R² = 0.90 and RMSE = 1.01 and a maximum difference of 2.7 kg and a minimum difference of 0.018 kg between the predicted and actual values; lean prediction with an R² of 0.72 and RMSE = 1.03; bone prediction with an R² of 0.50 and RMSE = 1.21; carcass prediction with an R² of 0.95 and RMSE = 1.31; and percentage of fat–carcass weight with an R² of 0.90 and RMSE = 1.30. ANNs also showed the lowest RMSE for fat, with a value of 1.01, and for carcass, with a value of 1.31. The image processing application calculations showed an uncertainty of -9.43 to 9.22 mm for chest width. The results also showed that many ewes had the same body condition score but different fat and chest widths, which confirmed that the body condition score may not provide an accurate indication of fat. The results showed an optimal fat of 9.37% of LW for ewes during the production cycle. If the percentage of fat is less than or more than 9.37%, farmers must take action to improve the condition of the animals to ensure the best performance during weaning and ewe and lamb survival during the next lambing. The new method can be used to determine body composition on sheep farms as an alternative to BCS since it showed more accurate results. This method can also lead to the use of new image analysis technologies and more research on using image processing on farms. The accuracy of the new method is slightly less than CT, but it takes less time and costs less than CT. It can be used on-farm at any stage of the ewe production cycle and can be applied to animals with wool or after shearing.
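The pipeline described in this abstract (a binary body mask cleaned by erosion and dilation, then measured for body parameters) can be sketched in outline. The version below uses only NumPy for portability; the thesis itself used OpenCV equivalents such as cv2.erode, cv2.dilate and HSV-based extraction, and the structuring element size and measurements here are illustrative assumptions rather than the thesis's actual code.

```python
import numpy as np

def erode(mask, k=3):
    """Binary erosion with a k x k square structuring element (cf. cv2.erode)."""
    p = k // 2
    padded = np.pad(mask, p, mode="constant", constant_values=False)
    out = np.ones_like(mask)
    for dy in range(k):
        for dx in range(k):
            out &= padded[dy:dy + mask.shape[0], dx:dx + mask.shape[1]]
    return out

def dilate(mask, k=3):
    """Binary dilation with a k x k square structuring element (cf. cv2.dilate)."""
    p = k // 2
    padded = np.pad(mask, p, mode="constant", constant_values=False)
    out = np.zeros_like(mask)
    for dy in range(k):
        for dx in range(k):
            out |= padded[dy:dy + mask.shape[0], dx:dx + mask.shape[1]]
    return out

def body_parameters(mask):
    """Body length and depth in pixels from the bounding box of the mask."""
    ys, xs = np.nonzero(mask)
    return xs.max() - xs.min() + 1, ys.max() - ys.min() + 1
```

Eroding and then dilating (a morphological "opening") removes isolated noise pixels from the segmented body before the bounding-box measurement, which is the role the erode/dilate/smooth steps play in the pipeline described above.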
  • Publication (Open Access)
    Implementing Multi Agent Systems (MAS)-based trust and reputation in smart IoT environments : A thesis submitted in partial fulfilment of the requirements for the Degree of Doctor of Philosophy at Lincoln University
    (Lincoln University, 2022) Al-Shamaileh, Mohammad
    The Internet of Things (IoT) provides advanced services by interconnecting a huge number of heterogeneous smart things (virtual or physical devices) through existing interoperable information and communication technologies. As IoT devices become more intelligent, they will have the ability to communicate and cooperate with each other. In doing so, an enormous amount of sensitive data will flow within the network, such as credit card information, medical data, factory details, pictures and videos. With sensitive data flowing through the network, privacy becomes one of the most important issues facing the IoT. Studies of data sensitivity and privacy indicate the importance of evaluating the trustworthiness of IoT participants to maximise the satisfaction and the performance of IoT applications. It is also important to maintain successful collaboration between the devices deployed in the network and ensure all devices operate in a trustworthy manner. This research aims to determine how to select the best service provider in an IoT environment based on the trustworthiness and the reputation of the service provider. To achieve this, we proposed a decentralised agent-based trust and reputation model, IoT-CADM (Comprehensive Agent-based Decision-making Model for IoT), to select the best service providers for a particular service based on multi-context quality of service. IoT-CADM, a novel trust and reputation model, is developed for the smart multi-agent IoT environment to gather information from entities and score them using a new trust and reputation scoring mechanism. IoT-CADM aims to ensure that service consumers are serviced by the best service providers in the IoT environment, which in turn maximises the service consumers’ satisfaction and leads IoT entities to operate and make decisions on behalf of their owners in a trustworthy manner.
To evaluate the performance of the proposed model against other well-known models such as ReGreT, SIoT, and R-D-C, we implemented a scenario based on the SIPOC supply chain approach, developed using the JADE agent development framework. This research used the TOPSIS approach to compare and rank the performance of these models based on different parameters chosen carefully for a fair comparison. The TOPSIS result confirmed that the proposed IoT-CADM has the highest performance. In addition, the model's parameter weights can be tuned to adapt to varying scenarios in honest and dishonest agent environments.
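TOPSIS, the ranking method named in this abstract, orders alternatives by their relative closeness to an ideal solution. A generic sketch of the standard method is below; the decision matrix, weights and criteria are illustrative placeholders, not the thesis's actual comparison data.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives (rows) on criteria (columns) by closeness to the ideal.

    benefit[j] is True if higher values of criterion j are better,
    False for cost criteria where lower is better.
    """
    m = np.asarray(matrix, dtype=float)
    # Vector-normalise each criterion column, then apply the criterion weights.
    v = m / np.linalg.norm(m, axis=0) * np.asarray(weights, dtype=float)
    # Ideal (best) and anti-ideal (worst) points per criterion.
    best = np.where(benefit, v.max(axis=0), v.min(axis=0))
    worst = np.where(benefit, v.min(axis=0), v.max(axis=0))
    # Euclidean distances to both points; closeness in [0, 1], higher is better.
    d_best = np.linalg.norm(v - best, axis=1)
    d_worst = np.linalg.norm(v - worst, axis=1)
    return d_worst / (d_best + d_worst)
```

For example, `topsis([[0.9, 120], [0.7, 60]], [0.5, 0.5], [True, False])` would trade a model's higher accuracy (benefit) against its higher latency (cost) under equal weights, the same shape of comparison the abstract describes across trust models.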
  • Publication (Embargo)
    Holistic Boolean model of cell cycle and investigation of related diseases through perturbation studies : A thesis submitted in partial fulfilment of the requirements for the Degree of Doctor of Philosophy at Lincoln University
    (Lincoln University, 2022) Pasha, Mustafa Kamal
    The cell cycle is the mechanism by which organisms develop and grow by cell division, where a mother cell produces two daughter cells with exact copies of its DNA. In the past few decades, much progress has been made in the field of systems biology in studying the complexity of the molecular regulation of the cell cycle. However, most recent computational models have focused on only a few aspects of the cell cycle because of its challenging complexity. Specifically, most studies model some core regulatory processes involved in DNA replication. However, very limited attempts have been made to model the other crucial aspect: consistent volume growth during the cell cycle to accommodate two sets of DNA and produce two daughter cells. Volume is particularly important because a number of debilitating diseases, including cancer, Alzheimer’s, Parkinson’s, and Down’s syndrome, have been attributed to aberrant cell cycles caused by volume dysregulation. DNA replication and volume growth are highly regulated concurrent processes in the cell cycle. This study develops a holistic computational model of the G1/S phase of the cell cycle, integrating volume and DNA replication processes in a temporal Boolean model to gain more comprehensive insights into the mammalian cell cycle. It constitutes the first comprehensive cell cycle model integrating volume. Additionally, it explores the robustness of the cell cycle from the perspective of the integrated operation of the multiple processes involved in cell division. It also explores the robustness of the cell cycle against single mutations. Further, this study probes the causes of cell cycle diseases (i.e., volume-related diseases, neurodegenerative diseases, and cancers) and potential avenues for their understanding, elimination and treatment. These aspects, along with temporal Boolean modelling, are the major novel contributions of the proposed study.
The cell cycle consists of four main phases, G1, S, G2, and M, representing two growth phases (G1 and G2), DNA synthesis (S) and mitosis (M), or DNA segregation. This study focuses on the G1 phase, where the cell accomplishes the first volume growth and prepares the conditions (Cyclins and proteins) necessary for DNA synthesis in S phase. Our investigation revealed that these two processes are tightly interlinked and concurrently regulated in G1. The proposed model captures these features to accurately represent this phase of the cell cycle. We focus only on the G1 phase primarily because we realised that closer attention to G1 is needed to bring a clearer picture of some of the crucial aspects of this interlinking that we have uncovered. A cell maintains its volume within close bounds in normal operation and doubles its size in the cell cycle. Volume is increased through osmosis and ion channel operations that bring water into the cell from outside due to gradients in ionic concentrations. Therefore, in cell volume growth, a cell adjusts ion gradients through the operation of a large number of ion channels located on its plasma membrane, which also signifies the important role of bio-electricity (membrane polarisation) in the cell cycle. In particular, the cell continuously adjusts membrane polarisation to activate the required ion channels throughout volume regulation. Studies have found that a large number of the ion channels involved in volume regulation in G1 are associated with normal and cancerous cell proliferation. This implication of volume involvement is another reason to keep our focus on G1. Further, during large-scale volume changes, a cell concurrently reorganises its cytoskeleton (CS) to accommodate volume growth. This is achieved primarily through elevated Ca+2, which is established early in the G1 phase through the K+ mediated hyperpolarisation of the membrane (Vmem) that activates Ca+2 channels to bring Ca+2 into the cell.
Ca+2 depolymerises the cytoskeleton and further contributes to increasing Vmem, which is required to activate Cl− channels. This changes the ionic gradient, causing an efflux of water through Aquaporin (AQ) and Taurine channels and leading to shrinkage of the cell. The purpose of cell shrinkage appears to be primarily to relax the cytoskeleton before swelling. Once the shrinkage has stopped, the swelling process starts through the operation of Cl− and osmolyte channels activated through increased Vmem for water influx, with concurrent repolymerisation of the CS, causing the cell to swell. Swelling is stopped upon reaching a volume threshold sensed by the sensor protein mTorC1. This sensing defines the volume checkpoint in the cell cycle. Ca+2 is a crucial player in controlling and linking both volume regulation and the preparation of the machinery for DNA replication. Specifically, while contributing to the regulation of cell volume as described above, Ca+2 also plays a concurrent role in initiating the DNA replication machinery by activating Immediate Early Genes (IEG), which leads to the production of the first cell cycle Cyclin, Cyclin D. The preparation of the machinery for DNA replication mainly refers to the preparation of the Cyclins required for DNA synthesis, which takes place in S phase. Cyclins are the drivers of DNA replication, and are themselves tightly regulated by the cell through production and degradation processes. The production of Cyclins happens in G1, where they control the transition from G1 to S phase. The G1 phase is important for Cyclin production, preparing three Cyclins: Cyclin D, E and A. Among them, Cyclin D is needed to partially release the E2F transcription factor, which is needed for the production of Cyclins E and A. The first cell cycle Cyclin, Cyclin D, hypophosphorylates the Retinoblastoma protein (Rb), partially releasing the bound E2F for CycE production. This process marks the first checkpoint of the system, called Rc in our proposed model.
This and the volume checkpoint are the two major checkpoints introduced in our model. The volume and DNA-replication-related processes further coincide during the cell’s passage through the Rg volume checkpoint. Passing Rg coincides with the complete release of E2F by Cyclin D and Cyclin E. The complete release of the E2F factor for bulk Cyclin E synthesis is done by Cyclins D and E together. During this process, the volume sensor protein, mTorC1, after having ensured that the cell passes Rg, plays a key role in helping to complete the full release of E2F from Rb to aid bulk CycE synthesis and subsequently CycA. The role of Cyclin E is to assemble the DNA replication machinery on the DNA in late G1; therefore, adequate preparation of CycE signifies the transition from G1 to S phase. We introduce subsystems to integrate the volume regulatory processes, for example, membrane polarisation, CS adjustment, Ca+2, and checkpoints, with DNA replication-related processes in a holistic system of the G1 phase, and represent the system in a temporal Boolean model. In particular, our investigation of the literature revealed the two G1 checkpoints mentioned above: one to ensure adequate cell growth (Rg) and the other to ensure readiness for the preparation of Cyclin E (Rc). Only the latter checkpoint, Rc, has been studied in past computational models, and we show in our model how these two checkpoints operate integrally. This constitutes another important contribution of the study. The goal of the study was to develop and study the G1/S network as a holistic system with multiple subsystems. For this, we curated a temporal Boolean core regulatory model of the G1 phase of the cell cycle with 34 nodes and 42 Boolean equations, simplified from over 100 elements. The network model contained six subsystems: signal initiation, Calcium establishment, volume regulation, cytoskeletal regulation, Cyclin synthesis, and checkpoints. The model was implemented in MATLAB.
An important aspect of our model is that it incorporates realistic times for the activation and operation of proteins, extracted from an extensive literature survey. This gives the model a temporal sense while avoiding the spurious trajectories commonly found in Boolean models with random asynchronous updates of protein states. This marks another important contribution of this research, as other existing Boolean cell cycle models lack time stamps and represent only the DNA-related process. We conducted a comprehensive study of the model to answer a number of questions: (i) Does the model resemble reality? We built the model from an extensive and exhaustive literature survey from which we distilled information for rigorous model building. Still, it is important to assess the correctness of the model logic and how well the model represents reality, in order to gain valuable insights from the model about the holistic operation of the cell cycle and to ensure that the model is realistic enough to study its response to various perturbations. The model simulation reveals the seamless operation of the subsystems to accomplish the G1/S transfer. Specifically, it correctly unfolds how Vmem, CS, and Ca+2 regulate volume, how the volume regulatory processes collaborate with the DNA replication-related processes, and how these two strands of activity intricately control the two checkpoints. (ii) How robust is the cell cycle design and how vulnerable is it to mutations? We conducted a comprehensive robustness study covering a number of investigations: a) the impact of mutation through element perturbations (knock on/off), to identify elements and subsystems that crucially impact the main processes of G1: volume regulation, CS and Ca+2, membrane polarisation (MP), checkpoints and G1/S transfer. We found that each subsystem works independently and collaboratively towards the achievement of the G1/S goal.
Any failure in one subsystem would either halt cell cycle progression at various points in G1 or lead towards cell death. We further found that the minimum working requirement for a subsystem is to achieve its local goal, i.e. to activate the neighbouring subsystem. Moreover, each subsystem has more than one element that can cause total system failure. b) What effects mutations have on volume-related diseases, i.e., Alzheimer’s (AD), Parkinson’s (PD), Down’s syndrome (DS) and other related diseases. We found that components of a subsystem of a larger network can be manipulated to effectively study the impact on overall disease development. We confirmed through mutational studies that there are components available in the subsystem that can potentially be exploited to stop disease progression, or even eradicate the disease if it is detected during its early onset. More specifically, the Calcium and volume systems play a role in the progression of these diseases. c) What are the most crucial elements responsible for cancer development and cell cycle-related diseases, including AD, PD and DS. The results showed that Cyclins, checkpoints (including the volume checkpoint Rg), and a few individual nodes from the swelling, Calcium, and ion channel subsystems can contribute towards cancer development and other major cell cycle-related diseases, and are therefore of potential interest. These add a wealth of new information to the literature. Further to this, other results are in concert with the existing literature. This first comprehensive G1 model of the cell cycle is a major contribution to cell cycle modelling with its holistic coverage. It has made novel contributions by integrating volume and DNA-related processes and probing the model for cell cycle robustness against mutations and temporal sensitivity, and by gaining insights into cancers, other diseases, and system failures. Major contributions:
1. Extensive literature review (over 300 sources), including over 90 computational studies
2. Multi-level regulatory network building, capturing concurrent regulation of volume increase and preparation of the crucial drivers (Cyclins) of DNA replication
3. Inclusion of volume sensing, introduction of the volume checkpoint Rg, and its integration with the Cyclin synthesis checkpoint Rc
4. Flexible Boolean logic synthesis for a comprehensive representation of the complex and dynamic G1/S system
5. Inclusion of temporal data from in-vitro expression studies to transform the standard Boolean model into a temporal Boolean model
6. Investigational studies via perturbations on cell cycle robustness, cancer development and cell cycle-related diseases
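The "temporal Boolean" idea described in this abstract (Boolean update rules whose state changes only take effect after a realistic delay) can be sketched as a small simulation loop. The three nodes, rules and delays below are illustrative placeholders only, not the thesis's actual 34-node G1/S network, and the thesis's model was implemented in MATLAB rather than Python.

```python
def simulate(rules, delays, state, steps):
    """Synchronous Boolean simulation with per-node time delays.

    Node n only switches state after its rule has demanded the new value
    for delays[n] consecutive steps, giving the model a temporal sense.
    Returns the trace of states, including the initial one.
    """
    held = {n: 0 for n in state}        # how long each pending change has persisted
    trace = [dict(state)]
    for _ in range(steps):
        nxt = {}
        for n, rule in rules.items():
            target = rule(state)
            if target != state[n]:
                held[n] += 1
                nxt[n] = target if held[n] >= delays[n] else state[n]
                if nxt[n] == target:
                    held[n] = 0         # change committed; reset the counter
            else:
                held[n] = 0
                nxt[n] = state[n]
        state = nxt
        trace.append(dict(state))
    return trace

# Illustrative toy network (NOT the thesis's network): an external signal
# activates Calcium immediately, and Cyclin D requires Calcium to be held
# for two steps before it switches on.
rules = {"Signal": lambda s: True,
         "Ca": lambda s: s["Signal"],
         "CycD": lambda s: s["Ca"]}
delays = {"Signal": 1, "Ca": 1, "CycD": 2}
init = {"Signal": False, "Ca": False, "CycD": False}
```

Because updates are deterministic and delayed rather than randomly asynchronous, a run produces a single ordered trajectory, which is how a temporal scheme avoids the spurious trajectories mentioned above.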
  • Publication (Restricted)
    Study of learning in small groups with an emphasis on facilitating effective learning in small groups in university programmes: A dissertation submitted in partial fulfilment of the requirements for the degree of Master of Applied Science
    (Lincoln University, 1993) Keown, A. M.
    A study of student expectations and perceptions of learning in small groups at the university was carried out. A class of senior students was approached, and volunteers sought. The students were interviewed, and the interviews were recorded on audio cassette. They also completed a questionnaire giving their demographic details. The students’ identities remained anonymous in the analysis of their replies. The students had clear expectations of the leader and member roles within a group. They expected the leader to define the task, suggest solutions for completing the task and hold the group together. They expected the members to contribute fully and participate in the group activity. Not surprisingly, their experience of working in small groups was similar to their expectations. The leader’s role was as they expected, except in the leaderless groups that some of the students were involved in, where the leadership role was not required. The students could identify the task and maintenance needs of the group, but they had no perception of either their own individual needs or the needs of the other individuals in the group. This showed that the students did not have any perception of the group processes working within their groups.
  • Publication (Open Access)
    Designing and testing holistic computational frameworks for identification of the most effective vaccine and drug targets against human and bovine tuberculosis : A thesis submitted in partial fulfilment of the requirements for the Degree of Doctor of Philosophy at Lincoln University
    (Lincoln University, 2021) Pawar, Pooja
    The World Health Organization considers tuberculosis (TB) a threat with a significant mortality and morbidity rate worldwide. TB is caused by the notorious Mycobacterium tuberculosis, which has evolved successful survival strategies leading to the emergence of drug-resistant TB strains, rendering the first-line TB drugs and the BCG vaccine ineffective. The global emergence of drug-resistant tuberculosis is threatening to make one of humankind’s most lethal infectious diseases incurable, with an estimated 10.0 million new TB cases and 1.4 million deaths in 2019. Further, TB affects animals too; bovine tuberculosis primarily affects cattle and is caused by the etiological agent Mycobacterium bovis. Twenty to thirty per cent of the global livestock population is potentially affected by bovine TB, leading to annual economic losses of more than USD 3 billion globally (Kuria, 2019). This study conducts an in-depth investigation into pathogen-human interactions to gain deeper insights into the evolution of the pathogen and its drug-resistance mechanisms, and uses this understanding to provide potential solutions for effective vaccine and drug development for human and bovine tuberculosis. Our research begins with gaining an understanding of the pathogenesis of human and bovine TB, the interaction of TB bacteria with their host, the host defence mechanism, bacterial survival strategies for evading the host immune response, and in-depth knowledge of the mechanisms of TB drug resistance. The current drug treatment regimen has not changed in nearly 40 years.
Although the first-line drugs play a pivotal role in combating TB, the emergence of resistant TB strains, due to different survival mechanisms of TB bacteria such as reduced permeability of the cell wall preventing drug entry into the cells, mutations in the drug target protein (a major hurdle in TB treatment), inactivation of drug molecules with the help of bacterial enzymes, and a transmembrane drug efflux system that expels the drug from the bacterial cell, has heightened the burden of TB globally. BCG is the only licensed vaccine available and has been around for almost a hundred years. BCG (Bacillus Calmette-Guérin) is prepared from a live-attenuated strain of Mycobacterium bovis and has shown protection in babies and young children. The inefficiency of BCG in reducing the prevalence of disease and in protecting adults is so far not understood. Some of the crucial factors might include the lower virulence of Mycobacterium bovis, which is not the primary causative agent of human TB, the diversity of TB strains, and the over-attenuation of the presently used BCG strain. The low efficacy of BCG, the emergence of drug-resistant Mycobacterium tuberculosis strains, and the challenges in developing drugs and vaccines have generated an urgent requirement for a powerful and effective therapeutic approach to TB treatment. This study introduces three holistic strategies/frameworks for developing new and effective therapeutic methods for fighting TB.
  • Publication (Open Access)
    Characterising sheep vocals using a machine learning algorithm : A thesis submitted in partial fulfilment of the requirements for the Degree of Master of Applied Science at Lincoln University
    (Lincoln University, 2021) Kayani, Bilal Nawaz
    New Zealand’s economy is heavily dependent on the farming sector, and the sheep sector is one of the most important parts of it, forming a backbone of the agricultural industry and placing New Zealand among the top five sheep-exporting countries in the world. International consumer trends show concern over the well-being of animals before slaughter, and research also indicates potential negative effects of stress on meat quality. Indicators of sheep well-being have largely been limited to physical weight gain and visually observable behaviour and appearance. There has been recent interest in, but little substantive research on, sheep vocalisation as a means of monitoring sheep well-being. This assumes that sheep vocalisations can be classified as representing different states of well-being. Therefore, this thesis investigated the potential to classify sheep vocalisations in a way that would enable automated assessment of the well-being of New Zealand sheep from recorded vocalisations. A supervised machine learning approach was used to classify the sheep vocals into happy and unhappy classes. Sheep sounds were collected from a New Zealand Ryeland sheep stud farm and online databases. After collection, these sounds were labelled by an expert and pre-processed to remove unwanted background noise, and features were extracted and selected for classification. Models were built, trained and tested. The models classified the sheep sounds used in this research into happy and unhappy classes with an accuracy of 87.5%. By demonstrating the ability to automatically classify sheep vocalisations, this research opens the door for further study of the well-being of sheep through their vocalisations. Future researchers could also collect larger vocal data sets across different breeds to test for breed-related variance in vocalisations.
This may enable future sheep well-being certification systems to be established to assure consumers of the well-being of pre-slaughter sheep life.
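The pipeline summarised above (label, pre-process, extract features, train, classify) can be sketched in a few lines. Everything here is an illustrative assumption: synthetic tones stand in for recordings, the two features are far simpler than those a real system would use, and a nearest-centroid rule stands in for the thesis's trained model.

```python
import numpy as np

def extract_features(signal):
    """Toy feature vector: RMS energy and zero-crossing rate.
    A real system would use richer spectral features."""
    rms = np.sqrt(np.mean(signal ** 2))
    zcr = np.mean(np.abs(np.diff(np.sign(signal)))) / 2
    return np.array([rms, zcr])

def train_centroids(examples):
    """Average the feature vectors of the labelled examples per class."""
    centroids = {}
    for label, signals in examples.items():
        feats = np.array([extract_features(s) for s in signals])
        centroids[label] = feats.mean(axis=0)
    return centroids

def classify(signal, centroids):
    """Assign the label whose centroid is nearest in feature space."""
    feats = extract_features(signal)
    return min(centroids, key=lambda lbl: np.linalg.norm(feats - centroids[lbl]))

# Synthetic stand-ins: "happy" bleats as quiet low-frequency tones,
# "unhappy" bleats as louder, noisier signals.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 8000)
happy = [0.3 * np.sin(2 * np.pi * 200 * t) for _ in range(5)]
unhappy = [np.sin(2 * np.pi * 600 * t) + 0.5 * rng.standard_normal(t.size)
           for _ in range(5)]

model = train_centroids({"happy": happy, "unhappy": unhappy})
print(classify(0.3 * np.sin(2 * np.pi * 210 * t), model))
```

A production pipeline would substitute real labelled recordings and cross-validated model selection for the hand-built centroids used here.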
  • PublicationOpen Access
    Modelling water allocation in community irrigation using multi-agent system : A thesis submitted in partial fulfilment of the requirements for the Degree of Doctor of Philosophy at Lincoln University
    (Lincoln University, 2021) Chiewchan, Kitti
Insufficient water for irrigation is a common problem in New Zealand, particularly in the Canterbury region, where use and demand have been steadily increasing over the past 20 years (Parliamentary Commissioner for the Environment, 2004). As water is a limited resource, there are restrictions around its use. Farmers who need water for irrigation can apply for consent through Environment Canterbury, but the process is lengthy and expensive. As a result, only those with large farms, or those who can realise greater financial benefits and higher levels of productivity, tend to apply. Instead, most farmers join a community irrigation scheme such as Central Plains Water Limited (CPWL), which sells water to individual farmers. As farmers must pay for each unit of water they use, they need a good irrigation plan in place to ensure they obtain the maximum profit from their investment. In New Zealand, most farmers use computer programmes to estimate their irrigation requirements; the two most common are IrriCalc and OVERSEER. However, both have limitations: each can only calculate the water needs of an individual farm, and neither can prioritise crop water needs during periods of water scarcity. To address this problem, we designed an agent-based irrigation management system that can optimise water allocation around the farm, which is particularly useful during periods of water scarcity, by taking crop types into account and prioritising them based on crop utility value. Because it calculates water savings based on each crop's growth stage and prioritises crops by their potential sales price, this agent-based system provides a way to increase farmers' profitability and enable them to thrive during periods of water scarcity. During a water reduction exercise, most farms suffer from water shortages. However, some farmers (who may have overestimated their water needs) will have excess water. Recognising this situation, we developed a multi-agent system to improve water allocation within a community of water users (where each agent represents a farm) and investigated the efficiency of water distribution mechanisms among farms. Farmers can use the proposed multi-agent water management system to negotiate with each other to buy and sell water among themselves. One of the simplest and best-known methods to achieve this is an auction; this choice was deliberate, as an auction allows agents to buy water at a price they are comfortable with. An agent must consider how much it is willing to pay for a specific volume of water to ensure its farm remains profitable. This study considered three auction types and compared the results of each in terms of fair water distribution, profit for sellers and reduced losses for bidders. We found that the pay-per-bid auctions (discriminatory and uniform) are the best strategies, balancing fair water distribution against profit within the water community. In addition, we investigated how varying behaviours of sellers and buyers affect the outcome of an auction.
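The discriminatory and uniform-price mechanisms compared in this abstract can be illustrated with a toy sealed-bid allocation of surplus water. The bidder names, prices and single-seller setup below are invented for illustration; the thesis studies fuller mechanisms among many farm agents.

```python
def run_auction(bids, units_for_sale, pricing="discriminatory"):
    """Allocate `units_for_sale` units of water to the highest bidders.

    bids: list of (bidder, price_per_unit, units_wanted).
    pricing="discriminatory": each winner pays their own bid.
    pricing="uniform": every winner pays the lowest winning bid.
    Returns {bidder: (units_won, total_paid)}.
    """
    ranked = sorted(bids, key=lambda b: b[1], reverse=True)
    allocation, remaining, winning_prices = {}, units_for_sale, []
    for bidder, price, wanted in ranked:
        if remaining == 0:
            break
        won = min(wanted, remaining)
        remaining -= won
        allocation[bidder] = [won, price]
        winning_prices.append(price)
    if pricing == "uniform":
        clearing = min(winning_prices)  # all winners pay the same price
        for bidder in allocation:
            allocation[bidder][1] = clearing
    return {b: (won, won * p) for b, (won, p) in allocation.items()}

bids = [("farm_a", 1.20, 40), ("farm_b", 1.00, 30), ("farm_c", 0.80, 50)]
print(run_auction(bids, 60, "discriminatory"))  # winners pay their own bids
print(run_auction(bids, 60, "uniform"))         # winners pay the clearing price
```

The uniform variant yields lower revenue for the seller here but treats all winners equally, which is the kind of distribution-versus-profit trade-off the study evaluates.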
  • PublicationRestricted
    Optical tracking for the ROBOTable project : A dissertation submitted in partial fulfilment of the requirements for the Degree of Bachelor of Applied Computing with Honours at Lincoln University
    (Lincoln University, 2004) Pattie, Carl
The ROBOTable is a new approach to facilitating distance learning of engineering concepts for schoolchildren, being developed by the Centre for Engineering Education Outreach at Tufts University, Boston. The table has a computer image back-projected onto it from underneath. Students create and test Lego™ robots on the table surface and compete with other children at similar tables in remote locations. An optical tracking system was implemented to track a robot on the table surface and provide information about its location and orientation to a remote table. The system identifies markers placed on top of the robots. Markers could be identified at rates of up to 15 Hz, depending on image complexity and lighting conditions. A marker's position is accurate to within 1.4% of the size of the table surface. For convenient development of applications, the tracking system was integrated into RoboLab™.
  • PublicationRestricted
    A LabView tool for creating electronic activity cards : A dissertation submitted in partial fulfilment of the requirements for the Degree of Bachelor of Applied Computing with Honours at Lincoln University
    (Lincoln University, 2004) Oliver, Craig
Traditional Activity Cards consist of a set of cards containing a series of iterative steps designed to provide students with an independent learning environment (Daily Activity Card Archive, n.d.). Electronic Activity Cards are based on the format of traditional Activity Cards but provide rich media such as video, audio, and interactive content to further enhance students' learning. This dissertation describes a system developed in LabVIEW™ to simplify the creation of electronic Activity Cards that can be viewed in LabVIEW™ and RoboLab™. The system provides a higher level of abstraction from standard LabVIEW™/RoboLab™ functionality, making it easier for a developer with minimal knowledge of these environments to create an Activity Card.
  • PublicationRestricted
    A management simulation tool for modelling equipment reliability : A dissertation in partial fulfilment of the requirements for the Bachelor in Agricultural Science (Hons) in Lincoln College
    (Lincoln College, University of Canterbury, 1988) James, M. L.
The Lincoln College Centre for Computing and Biometrics (CCB) administers and maintains much electronic equipment, including a sizeable pool of networked and stand-alone IBM (PC) XT and AT compatible microcomputers. As the College develops and grows, it is envisaged that this pool of computers will be expanded to keep pace with student and staff demand. Besides purchase and installation costs, the CCB imposes an ongoing cost on the College in funding the infrastructure required to maintain these machines. This funding pays for resources such as support staff, consumables and replacement components, as well as various other costs. Assuming that funding constrains the maintenance of this computing facility, the question facing the manager is exactly what level of resourcing is required to ensure an optimum maintenance policy for this equipment. One way to help answer the manager's question is to model the reliability of the equipment in question, simulating the occurrence of equipment faults according to defined probabilities. Such a model would also simulate the activities of the staff responsible for maintaining the facility. This paper reports on preliminary investigations into the process of maintaining the population of PCs administered by the CCB. An analysis was made of the records kept by the CCB concerning PC maintenance. A survey was also conducted on the wider aspects of supporting such a facility, focusing on the frequency and origins of events requiring the action of support staff. This paper also reports on the implementation of a partially developed discrete-event simulation, designed to go some way towards answering the manager's question. Finally, an assessment is made of the directions further study could take if the simulation were developed into a useful management tool. An important aim in preparing this report is to impart the information necessary to allow further development of the simulation by a person with previously limited knowledge of it.
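A fault/repair process of the kind described above can be sketched as a small discrete-event simulation. The fleet size, fault rate and repair time below are invented for illustration; the dissertation's figures came from CCB maintenance records.

```python
import heapq
import random

def simulate(n_machines=30, n_staff=2, mean_fault_days=60.0,
             repair_days=0.5, horizon_days=365.0, seed=1):
    """Count faults handled over the horizon with a limited repair team."""
    rng = random.Random(seed)
    events = []  # priority queue of (time, kind)
    for _ in range(n_machines):
        heapq.heappush(events, (rng.expovariate(1 / mean_fault_days), "fault"))
    busy = 0    # staff currently repairing
    queue = 0   # faulty machines waiting for staff
    faults = 0
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon_days:
            break
        if kind == "fault":
            faults += 1
            # schedule this machine's next fault (repairs are short
            # relative to the time between faults)
            heapq.heappush(events,
                           (t + rng.expovariate(1 / mean_fault_days), "fault"))
            if busy < n_staff:
                busy += 1
                heapq.heappush(events, (t + repair_days, "repaired"))
            else:
                queue += 1
        else:  # a repair finished; start the next queued job, if any
            if queue:
                queue -= 1
                heapq.heappush(events, (t + repair_days, "repaired"))
            else:
                busy -= 1
    return faults

print(simulate())  # faults handled in one simulated year
```

Varying `n_staff` against queue lengths and downtime is the kind of experiment the manager's question calls for.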
  • PublicationRestricted
    Issues surrounding client-server computing: A dissertation submitted in partial fulfilment of the requirements for the Degree of Bachelor of Commerce (Honours) in Lincoln University
    (Lincoln University, 1993) Kong, Yen Teck
Client-server computing has radically changed the way information is processed. Its basic concept is the splitting of the application logic and the physical location of the data. Some of the participating computers in a client-server system are called clients; the others are called servers. Both clients and servers are assigned the tasks (e.g. data management, user interface) that they are best suited to perform. As client-server computing is a relatively new technology, it has frequently been misunderstood, and many issues surrounding it remain poorly explored. Those issues must be examined as the technology matures. Client-server computing has become a very important technology because of the benefits it promises to deliver, including a reduction in network traffic, improved access to corporate data, and better user interfaces. A client-server database management system was implemented using Microsoft Access as the client application and Rdb/VMS as the server database, and some issues surrounding client-server computing were investigated using the implemented system.
  • PublicationRestricted
An exploratory study on business use of the Internet in New Zealand : A dissertation submitted in partial fulfilment of the requirements for the degree of Bachelor of Commerce (Honours)
    (Lincoln University, 1995) Lim, Wei Liang Leon
There is a lack of current research on the use of the Internet for business. Most articles are from trade publications and popular books, and simply promote the use of the Internet for business. There is even less information on the use of the Internet by New Zealand businesses. This study was conducted to investigate the current use of the Internet in New Zealand. It examined differences between claims made in the literature and the existing perceptions and practices of New Zealand businesses already using the Internet. The study also looked at perceived benefits and problems, especially in the areas of marketing and advertising. A questionnaire survey and follow-up telephone interviews were used to obtain data from companies with access to the Internet. Results indicated that in New Zealand, smaller and/or computer-technology-focused companies may use the Internet more extensively than any other group and may obtain more benefits from its use. The findings also revealed that the low number of potential customers and/or suppliers online, and unreliable connections, are major concerns for the companies surveyed. Directions for future research are also discussed.
  • PublicationRestricted
Interactive modelling of shape preserving design curves : A dissertation submitted in partial fulfilment of the requirements for the degree of Bachelor of Commerce and Management (Honours)
    (Lincoln University, 1997) Seymour, Chris Ian
Computer-aided design is a widely used tool in industrial design and has been applied to areas such as car, airplane and boat design, to name a few. What is required as a result are applications that allow a designer to interact with a design: designers want to be able to change, in real time, the shape of curves and surfaces while maintaining desirable properties such as smoothness or locality of a change. This project investigates a scheme that places a "shape preserving" curve through a set of data points. The default curve obtained is a rational piecewise cubic curve represented in Bezier form. From the scheme used to produce the default curve, we derive four methods that can be used to change the curve while maintaining a predefined level of smoothness and preserving its shape-preserving properties. The "Bezier Curve Manipulator" is a tool, written using OpenGL 1.1, for interactively changing a segment using one of the four movement methods mentioned above. The program allows designers to see, in real time, the change to the shape of the curve while maintaining a desired level of smoothness. A change is made by direct manipulation of a Bezier control point, or of a weight value associated with each segment. As well as maintaining a smooth curve, other factors such as the locality and the predictability of a change are considered. It is the effect of these four methods on the shape of the curve that is discussed with reference to those criteria.
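A rational cubic Bezier segment of the kind the abstract describes as the "default curve" can be evaluated directly from its control points and weights. The points and weights below are illustrative assumptions; the dissertation derives them from its shape-preserving interpolation scheme.

```python
def rational_cubic_bezier(p, w, t):
    """Evaluate a rational cubic Bezier at parameter t in [0, 1].

    p: four control points (x, y); w: four positive weights.
    Raising a weight pulls the curve toward its control point,
    which is the lever a weight-based movement method uses.
    """
    # Cubic Bernstein basis values at t
    b = [(1 - t) ** 3, 3 * t * (1 - t) ** 2, 3 * t ** 2 * (1 - t), t ** 3]
    denom = sum(wi * bi for wi, bi in zip(w, b))
    x = sum(wi * bi * pi[0] for wi, bi, pi in zip(w, b, p)) / denom
    y = sum(wi * bi * pi[1] for wi, bi, pi in zip(w, b, p)) / denom
    return (x, y)

pts = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
weights = [1.0, 2.0, 2.0, 1.0]
print(rational_cubic_bezier(pts, weights, 0.0))  # curve starts at the first point
print(rational_cubic_bezier(pts, weights, 0.5))  # symmetric data: x = 2 at midpoint
print(rational_cubic_bezier(pts, weights, 1.0))  # and ends at the last point
```

An interactive tool like the one described would re-evaluate this function across t after each control-point or weight edit and redraw the segment.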
  • PublicationRestricted
    A tool for dynamic exploration of primers : A dissertation submitted in partial fulfilment of the requirements for the Degree of Bachelor of Applied Computing with Honours at Lincoln University
    (Lincoln University, 2003) Rutherford, P.
A program has been developed to assist scientists in the biological sciences with the selection of primers for the polymerase chain reaction (PCR). Primer3 is a tool that can suggest primers based on stated requirements. The traditional query process encourages strict criteria to be placed on primers, to limit the number of primers suggested and to keep them within an ideal range. Too often, however, these restrictions are too strict, and users must reformulate their queries and try again, with little guidance on how to adjust the query. With our program, broad criteria can be submitted to Primer3 and the user can then refine the criteria interactively: the program uses filters to distinguish the primers that meet new criteria, and the user can adjust the filters on the suggested primers and see the effect. Primers not meeting the filter criteria are retained in the display to help guide the user in adjusting the filters. The interactions provided support dynamic exploration of the data set by allowing rapid, iterative, and reversible changes to the filter criteria. Detailed information about selected primers is displayed in a text window for visual inspection and to facilitate a final choice. The solution was developed using the open-source, cross-platform packages Tcl/Tk and VTK.
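The filter-and-retain interaction described above can be sketched as follows. The field names (`tm`, `gc`) and threshold ranges are assumptions for illustration; the actual tool filters real Primer3 output.

```python
def apply_filters(primers, filters):
    """Split primers into those passing every filter and those retained
    (annotated with the criteria they failed) to guide filter adjustment."""
    passing, retained = [], []
    for primer in primers:
        failed = [name for name, (lo, hi) in filters.items()
                  if not lo <= primer[name] <= hi]
        if failed:
            retained.append((primer["id"], failed))
        else:
            passing.append(primer["id"])
    return passing, retained

primers = [
    {"id": "P1", "tm": 59.8, "gc": 52.0},
    {"id": "P2", "tm": 63.5, "gc": 48.0},
    {"id": "P3", "tm": 58.0, "gc": 35.0},
]
filters = {"tm": (57.0, 62.0), "gc": (40.0, 60.0)}
passing, retained = apply_filters(primers, filters)
print(passing)   # P2 fails the tm filter, P3 the gc filter
print(retained)
```

Because failing primers are kept with the reasons they failed, a user can see at a glance which threshold to relax, which is the guidance the traditional query process lacks.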
  • PublicationRestricted
    A usability analysis of portable two-way radios : a dissertation submitted in partial fulfilment of the requirements for the degree of Bachelor of Applied Computing (Honours) at Lincoln University
    (Lincoln University, 2001) Frizzell, Hamish
    In this dissertation, the usability of the Tait Orca Eclipse and 5020 (SlimTOP) portable radios was evaluated. User trials were conducted with six users, chosen from a range of different backgrounds. The trials were designed to find the major usability problems; statistical significance of the results was not a priority. A set of usability criteria are suggested and used to analyse the trial results. The trials found several aspects of the interface which caused users some difficulty. For the majority of these aspects, the criteria can explain why these problems occurred. The results of the trials and the critical analysis were used to recommend some possible improvements to the user interfaces of the radios.
  • PublicationRestricted
    A further investigation into transforming spreadsheet data using XML : a dissertation submitted in partial fulfilment of the requirements for the degree of Bachelor of Applied Computing with Honours at Lincoln University, 2004
    (Lincoln University, 2004) Spray, Wendy
Spreadsheets are often the first choice for storing data; however, when complex or large data sets are involved, spreadsheets are soon outgrown. The next logical step is to move the data into a database. When moving data from a spreadsheet to a database, problems arise when the spreadsheet holds the data in a single "flat" table while the database design calls for the data to be stored in multiple tables. While there are many tools available to manage this transfer process, at present no one tool can manage all the required transformations of the data. We investigate using Extensible Markup Language (XML) as the basis for creating a transformation tool that will suit all purposes. As a large number of recently developed office suite applications allow close integration with XML specifications, XML has great potential as a universal medium for communication. The Microsoft .NET Framework is one of the platforms developed to support XML, and it is used in this research to create the data transfer application. There are two main ways to use the Framework to manage the transformation process: the first uses the XML classes inherent in the Framework to manipulate an XML file directly; the second uses ADO.NET, a data access tool that connects directly to a data source, enabling files other than those in XML format (e.g. a file in .xls format) to be used as the source. While both alternatives are discussed in detail with coding examples, ADO.NET provides the greatest flexibility, and the final application to perform the data transformation is created using the functionality it provides.
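The flat-table-to-multiple-tables problem described above can be illustrated independently of ADO.NET. In this Python sketch (column names and rows invented for illustration), a parent table of unique keys is extracted and given surrogate ids, and a child table references it:

```python
def normalise(rows, key_cols, detail_cols):
    """Split a flat table into a parent table of unique key_cols rows
    (with surrogate ids) and a child table referencing the parents."""
    parents, parent_ids, children = [], {}, []
    for row in rows:
        key = tuple(row[c] for c in key_cols)
        if key not in parent_ids:  # first time this key combination appears
            parent_ids[key] = len(parents) + 1
            parents.append({"id": parent_ids[key],
                            **{c: row[c] for c in key_cols}})
        children.append({"parent_id": parent_ids[key],
                         **{c: row[c] for c in detail_cols}})
    return parents, children

flat = [
    {"customer": "Acme", "city": "Lincoln", "item": "nails", "qty": 3},
    {"customer": "Acme", "city": "Lincoln", "item": "glue", "qty": 1},
    {"customer": "Bruce", "city": "Timaru", "item": "nails", "qty": 7},
]
customers, orders = normalise(flat, ["customer", "city"], ["item", "qty"])
print(customers)  # two unique customers, each with a surrogate id
print(orders)     # three order rows referencing the customer ids
```

An ADO.NET version of the same transformation would load the spreadsheet into a DataSet and write the two resulting tables out to the target database.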
  • PublicationOpen Access
    Investigating the potential role of visualisation in natural resource decision-making
    (2017-01-01) Otinpong, Bernard; Charters, Stuart; McKinnon, Alan E.; Gidlow, Robert G. A.; Syme, G.; Hatton MacDonald, D.; Fulton, B.; Piantadosi, J.
Computer-aided visualisation can be applied to natural environments to understand the impact of proposed developments or management strategies, but little evaluation of the effectiveness of these tools has been undertaken. In seeking to manage natural environments, it is desirable to model and understand their complex interactions in order to compare the outcomes of applying different management strategies. The purpose of this study was to investigate whether there are significant differences in knowledge outcomes depending on the form in which visualisation of environmental changes is presented, using a case study of Te Waihora/Lake Ellesmere, a broad, shallow lagoon in the South Island of New Zealand. Te Waihora/Lake Ellesmere is separated from the Pacific Ocean by the long, narrow, sandy Kaitorete Spit. Its unique position allows it to be opened to the sea periodically to provide drainage and prevent flooding of surrounding farmland. There is a lack of agreement among the diverse stakeholders regarding the levels at which the lake should be maintained throughout the year. We describe an interactive visualisation tool (ElleVis) which shows the effects of different water levels on the plants and animals living in and around the lake. The tool allows users to input different opening scenarios and visualise the resulting impact on water levels around the lake at various times. It incorporates historical rainfall data from New Zealand's National Institute of Water and Atmospheric Research to deliver a graphical map display, including a summary table with a 'traffic light' status for lake values (birds, fish, farming and other stakeholder interests) at different locations around the lake. The interactive nature of ElleVis allows stakeholders to compare Te Waihora/Lake Ellesmere under different opening scenarios within one tool. It is possible, however, that providing information about changes in lake behaviour in a carefully and clearly presented non-interactive form may be as successful as providing it in the interactive form of ElleVis. To test for the effect of interactive versus non-interactive forms of visualisation, we conducted an experiment with forty participants (randomly assigned to two test groups) who had various interests in Te Waihora, providing them with either an interactive or a non-interactive form of visualisation. Results were recorded in a structured interview after the test. The findings revealed that interactive visualisation was key to advantageous learning about changes in environmental behaviour. We argue that the techniques presented have the potential to stimulate meaningful discussions in natural resource situations involving contested resources or a multiplicity of interests, but that at the same time there is an urgent need for evaluation of such tools in participatory decision-making processes.
  • PublicationRestricted
    An evaluation and comparison of the different approaches to deploying Windows NT and 32-bit applications to end-users: A dissertation submitted in partial fulfilment for the requirements for the degree of Bachelor of Commerce (Honours)
    (Lincoln University, 1997) Skelton, Adam
As software developers continue to write 32-bit programs to take advantage of recent hardware developments, users are requesting 32-bit operating systems on which to run them. Network administrators are faced with the problem of deploying 32-bit operating systems, such as Windows NT, over their enterprise networks. This investigation evaluates four common methods used to deploy Windows NT and 32-bit applications to end-users, including an evaluation of a new multi-access Windows NT server, Citrix WinFrame, as one of the deployment methods. The investigation discusses the trade-offs between the performance advantages gained by having all files stored on the local hard disk and the administrative ease of centralised file servers.
  • PublicationRestricted
An architecture for distributed multiplayer RoboTable games: A dissertation submitted in partial fulfilment of the requirements for the Degree of Bachelor of Software and Information Technology with Honours
    (Lincoln University, 2008) Zhao, Kun
Lego robotics has been introduced as a tool for teaching and learning, and the RoboTable has been developed as a tabletop learning environment for it: learners interact with the RoboTable using their Lego robots. Currently, RoboTables are stand-alone devices, but if RoboTables could be linked over a network, school children in different places could work together, creating a better learning environment by letting them learn from each other. This project focuses on investigating a universal, platform-independent way to simplify creating network communication between RoboTable games. The proposed approach is a communication toolkit that helps RoboTable game developers easily add communication to multiplayer RoboTable games. The toolkit provides the essential communication functionality: creating connections and sending and receiving messages. We implemented a prototype toolkit to support creating multiplayer RoboTable games, along with a central communication server for handling the communication between games. In addition, we created a sample game to test the toolkit and to act as an example for RoboTable game developers. The toolkit approach was evaluated through testing with the sample game. The results showed that the toolkit is an easy and useful way for RoboTable game developers to create multiplayer RoboTable games with correct game state and minimal effort. More importantly, the central communication server can work with any type of multiplayer RoboTable game, which means developers do not have to develop a server for each game. This approach significantly reduces the effort required to develop multiplayer RoboTable games.
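The central-server idea in this abstract can be sketched in-process: a relay that forwards game-state messages between registered games. The class and method names here are assumptions for illustration; the real toolkit communicates over a network.

```python
class RelayServer:
    """Game-agnostic relay: forwards each message to every other game."""
    def __init__(self):
        self.games = {}

    def register(self, name, game):
        self.games[name] = game

    def send(self, sender, message):
        """Forward a message from one game to all the others."""
        for name, game in self.games.items():
            if name != sender:
                game.receive(sender, message)

class RoboTableGame:
    """Minimal client: collects messages it receives from the relay."""
    def __init__(self):
        self.inbox = []

    def receive(self, sender, message):
        self.inbox.append((sender, message))

server = RelayServer()
game_a, game_b = RoboTableGame(), RoboTableGame()
server.register("table_a", game_a)
server.register("table_b", game_b)
server.send("table_a", {"robot": 1, "x": 10, "y": 20})
print(game_b.inbox)  # table_b sees table_a's robot position
```

Because the relay never inspects message contents, the same server logic works for any game, which mirrors the abstract's claim that one central server serves all multiplayer RoboTable games.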