Tuesday, May 26, 2020
Analysis Assignment Paper on The Bovis PLC - 275 Words
Analysis Assignment Paper on The Bovis PLC (Essay Sample) Content: Analysis of Bovis Homes PLC

Task 1

a. Net asset value in Bovis Homes plc: 1,630. Property, Plant and Equipment: 50. Debtors: (10). Revised NAV in Bovis Homes plc: 1,670 (figures in GBP millions).

b.1. Ke = rf + β(rm − rf). When β is 0.8: Ke = 3% + 0.8 × (6% − 3%) = 3% + 0.8 × 3% = 5.4%. When β is 0.1: Ke = 3% + 0.1 × (6% − 3%) = 3% + 0.1 × 3% = 3.3%.

b.2–3. Vd should be adjusted because the figure shown in the balance sheet is at par value; market values should be used, with equity valued at the share price of 820p at 31 December 2016 on the LSE. Therefore: Ve = 134 × 8.2 = £1,098.8m; Vd = 170 × 90/100 = £153m; Kd = 2.67%. The WACC is then computed once with Ke = 5.4% and once with Ke = 3.3%.

c. When Ke is 5.4% and g = 0%, the theoretical price is … When Ke is 5.4% and g = 2%, the theoretical price is … When Ke is 3.3% and g = 0%, the theoretical price is … When Ke is 3.3% and g = 2%, the theoretical price is …

d. EPS in 2015 is 90p. The market price on 31 December 2015 is 1015p, and on 31 December 2016 it is 820p. The P/E ratios for 2015 and 2016 are therefore 11.28 and 9.11 respectively. The average P/E ratio for the company's industry is 12, which is higher than both. This implies that the market does not have much confidence in Bovis Homes PLC's future earnings. The P/E ratio has also fallen from 11.28 in 2015 to 9.11 in 2016; the worsening ratio implies that investors are losing confidence in the ability of Bovis Homes PLC to sustain its current earnings in the future. The P/E ratios over the two years suggest that other firms in the same industry either have greater potential for higher earnings or carry less risk.

Task 2

(a) Definition of P/E

The price-to-earnings ratio is one of the most familiar common-stock valuation methods and has been used since the early 20th century. The P/E ratio is obtained by dividing the market price of a share (numerator) by earnings per share (denominator) (Koller, Goedhart and Wessels, 2010). 
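The Task 1 workings above can be reproduced in a short script. This is a sketch, not part of the original sample: the CAPM inputs, market values, Kd and P/E figures are taken from the text, while the WACC step is an assumption, applying the standard market-value-weighted formula with no tax adjustment, since the sample's own WACC figures were not preserved.

```python
# Sketch of the Task 1 workings, using the figures quoted in the sample.

def capm(rf, beta, rm):
    """Cost of equity via CAPM: Ke = rf + beta * (rm - rf)."""
    return rf + beta * (rm - rf)

rf, rm = 0.03, 0.06                  # risk-free and market returns from the sample
ke_high = capm(rf, 0.8, rm)          # beta = 0.8 -> 5.4%
ke_low = capm(rf, 0.1, rm)           # beta = 0.1 -> 3.3%

ve = 134 * 8.20                      # 134m shares at 820p -> GBP 1,098.8m
vd = 170 * 90 / 100                  # debt valued at 90 per 100 par -> GBP 153m
kd = 0.0267                          # cost of debt as given in the sample

def wacc(ke, kd, ve, vd):
    # Assumption: standard market-value weights, taxes ignored.
    return (ve * ke + vd * kd) / (ve + vd)

pe_2015 = 1015 / 90                  # trailing P/E at 31 Dec 2015 -> ~11.28
pe_2016 = 820 / 90                   # trailing P/E at 31 Dec 2016 -> ~9.11

print(round(ke_high, 4), round(ke_low, 4))    # 0.054 0.033
print(round(wacc(ke_high, kd, ve, vd), 4))    # ~0.0507
print(round(wacc(ke_low, kd, ve, vd), 4))     # ~0.0322
print(round(pe_2015, 2), round(pe_2016, 2))   # 11.28 9.11
```

Note that the two P/E values computed here match the 11.28 and 9.11 quoted in part (d).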
There are several justifications that support P/E in the valuation of stocks. First, earning power is considered the main driver of investment value. EPS then becomes the primary focus of investors and security analysts, as it shows the earning power of the firm. Second, the P/E ratio is widely used and recognized by investors across the globe. Third, a difference in P/E ratio may be related to a difference in the average returns on investment in those stocks. These grounds lead many investors to choose the P/E ratio when deciding which security to invest in.

However, selecting an appropriate EPS figure is not as straightforward as finding the current market price of the stock. The first issue when selecting the EPS figure is the time horizon over which the earnings of a stock have been measured. The second issue is the modifications to accounting earnings that an analyst makes in order to compare the P/Es of different companies.

There are two types of P/Es: a trailing P/E and a forward P/E. A trailing P/E, also called a current P/E, is calculated by dividing the current market price of the stock by the EPS of the four most recent quarters. The EPS in this calculation is sometimes called trailing 12-month EPS (Campbell and Thompson, 2008):

Trailing P/E = Current market price per share / Previous 12 months' EPS

The forward P/E is sometimes referred to as the prospective or leading P/E. It is calculated by dividing the current market price per share by the forecasted EPS for the next 12 months (Damodaran, 2012):

Forward P/E = Current market price per share / Expected EPS for the next year

The advantage of using the forward P/E over the trailing P/E is that it helps investors make decisions based on the predicted performance of the company in the coming year rather than on historical data. The trailing P/E can be affected by uncertainties in future earnings growth. 
Hence, many investors prefer a prospective P/E, as the valuation of a stock is a forward-looking process (Pinto, Henry, Robinson and Stowe, 2010). The trailing P/E may be used when it is difficult to predict the future earnings of a company, or when the forecasted future earnings are deemed unreliable by an analyst.

A higher P/E is preferred since it is an indicator that the earnings of a particular company are expected to grow in the future. Also, when analyzing a company in a certain industry, it is important to compare its P/E with the industry average. If it has a higher P/E than the average, that implies it is expected to grow its earnings faster than most of its peers; if its P/E is lower than the average, its earnings are expected to grow more slowly than the industry's (Jitmaneeroj, 2017).

Advantages of P/E

The P/E ratio is an easy method to use in valuing a company's shares; it can be used even by people who are not well versed in finance. Calculating it requires only dividing two figures, and the interpretation is that the higher it is, the greater the expectation of future earnings growth. This is one of its main advantages: even though it is a basic tool for analyzing the worth of a stock, it can be used to make quick decisions.

The P/E ratio is a better indicator of the real value of a stock than the price alone. For example, a stock with a price of USD 50 and a P/E of 5 is much cheaper than a stock with a price of USD 5 and a P/E of 50; this comparison holds when the two stocks are from the same industry. P/E can thus serve as an excellent tool for benchmarking stocks against the industry average or against specific competitors.

Disadvantages of P/E

The first limitation of P/E is that it is subjective in nature. Stock prices are usually volatile, and it is estimated that the 40–100 most active investors influence more than half of stock price changes (McMilan, 2015). 
Also, when there is optimistic sentiment in the economic and business environment, stock prices tend to increase, raising the P/E; during a recession in the economy, stock prices are undervalued, lowering the P/E.

Inflation is also a factor that influences the size of the P/E ratio. During inflation, earnings from or to a foreign country can be devalued, creating a higher P/E; inflation can also cause the share price of the company to decrease and hence leave the stock looking overvalued (Wei, 2010).

Another disadvantage of P/E is that its interpretation is problematic. Two different analysts can come up with different recommendations from the same P/E figure: while one investor may consider a specific P/E ratio too low, another may view it as too high. This variation in perception arises from the fact that the P/E ratio is heavily driven by the current market price of the stock.

Dividend Growth Model (DGM)

The dividend growth model is a method used by investors to measure the value of a firm's stock. An investor becomes a shareholder of a company by buying its stock and letting the company use his or her money to meet its financial needs; in return, the company pays the investor dividends, a share of profit on the funds invested. The dividend growth model is also called the dividend discount model, since it gives insight into whether a stock is on sale. When using this method, investors do not take into consideration customer loyalty or brand name; the model considers only the dividend to be paid next year, the required rate of return (cost of capital), and the growth rate of the dividends (Barth, Konchitchki and Landsman, 2013). 
The stock value can be calculated with the formula below, which applies when there is a constant growth rate (Ohlson and Gao, 2006):

Stock Value = D / (k − g)

Where:
D is next year's expected annual dividend per share.
k is the required rate of return, the lowest rate an investor would accept to buy the stock.
g is the expected growth rate of the dividend.

Advantages of DGM

This method is easy to use, and anyone can understand it compared with other valuation methods such as the discounted cash flow method. It requires only three variables which, once obtained, make it easy to calculate the value of the stock.

Secondly, this model has a logical and sound basis. It rests on the fact that investors purchase stocks in order to receive payments in the future. Even though investors can buy stocks for various reasons, the basic premise is that they would like a return on their money at some point.

Thirdly, this model can be reversed and used to determine the growth rates predicted by experts. If investors know the predicted price per share of a stock, they can use the model to calculate the expected dividends, which tell them their expected earnings.

Disadvantages of DGM

The first limitation of this method is that it applies only to companies that issue dividends; this type of valuation cannot be performed on companies that do not. Though most companies whose stocks trade publicly issue dividends, some do not, for various reasons.

The dividend growth model reflects rationality but not reality. Investors should normally invest in the stocks that pay them the highest dividends, yet at times they behave differently. An investor may buy a company's stock not for its financial status or future dividends but simply because it is interesting or glamorous. 
This is the reason why there is usually a discrepancy between actual market value and intrinsic value. Thou...
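The constant-growth valuation described in this section can be sketched in a few lines of code. The dividend, required return and growth figures below are invented for illustration; the model is only defined when k exceeds g.

```python
# Minimal sketch of the constant-growth dividend model: value = D / (k - g).

def dgm_value(d_next, k, g):
    """Intrinsic stock value under the dividend growth (Gordon) model."""
    if k <= g:
        # The formula diverges or goes negative unless k > g.
        raise ValueError("required return k must exceed growth rate g")
    return d_next / (k - g)

# Hypothetical example: next year's dividend 45p, required return 8%, growth 3%.
price = dgm_value(45, 0.08, 0.03)    # 45 / 0.05 = 900p
print(round(price, 2))               # 900.0
```

Reversing the model, as the advantages paragraph notes, is just algebra on the same identity: given a market price, D = price × (k − g).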
Saturday, May 16, 2020
A Comparison of Of Mice and Men and The Great...
A Comparison of Of Mice and Men and The Great Depression: An Eyewitness History. The Great Depression is comparable to Lennie's and George's lives. I would like to compare George Milton and Lennie Small to the Great Depression. The story takes place during the Great Depression, and John Steinbeck captured the reality of this most difficult time. During the Great Depression, people needed to travel together and share chores and duties to make a living until something better came along. That is the way George and Lennie traveled. They traveled together to take care of each other, though George took care of Lennie the most, because Lennie was always getting in trouble: "You do bad things and I got to get you …" George would then have to try to get Lennie out of the current predicament. This ties in with the attitude of the people during the Great Depression, because people were constantly unsettled. People in the Great Depression were losing all of the money they had worked so hard to earn and save; when the banks closed, they lost everything. When someone found themselves in great difficulty on a farm or ranch, they had to seek some other opportunity, and it was very important not to let anyone know what had happened where you were previously employed. In 1928, Herbert Hoover was elected president. Wall Street was then struck by the greatest stock market crash in the history of the United States of America. This caused everyone, and especially the banks, to panic. Everyone was naturally concerned about the safety of their money and went to the banks to get what money they could, but there was not enough money for everyone to withdraw. This was the beginning of the Great Depression. During this period, President Franklin D. Roosevelt was inaugurated. 
President Roosevelt said, "So, first of all, let me assert my firm belief that the only thing we have to fear is fear itself--nameless, unreasoning, unjustified terror which paralyzes needed efforts to convert retreat into advance" (The Great Depression: An Eyewitness History, p. 105). His first action as president was to implement what is known as the New Deal to
Wednesday, May 6, 2020
Supply Chain Management (Klassen & Whybark, 1994)
A supply chain can be described as a network of organizations that collaborate to share information and materials. Supply chain management is a way for companies to gain competitive advantage over their competitors by reducing cost and lead time while increasing efficiency through differentiating the processes and the links between suppliers and buyers (Klassen & Whybark, 1994). For doing so, using IT-based systems is the need of the hour. With ever-growing demand and expectations from customers, along with a dynamically changing market, companies are turning to the successful deployment of IT to re-engineer their supply chains so as to sustain and grow their business (Porter, 1986). The increasing development in IT and communication, leading to the integration of system architecture and information technology, presents a great opportunity to incorporate IT into the supply chain (Balan, Vrat and Kumar, Assessing the challenges and opportunities of global supply chain management, 2006). Moreover, optimal information sharing using IT-based systems has reduced information distortion within the supply chain (Balan, Vrat and Kumar, Information distortion in a supply chain and its mitigation by using Soft Computing Approach, 2009). Recent advances have also made it possible for companies to be flexible and respond quickly to the changing demands and conditions of the market (Lee, So and Tang, 2000). IKEA
Tuesday, May 5, 2020
Use of Big Data for Gaining Business-Free-Samples for Students
Question: Discuss business intelligence using big data in supply chain and operations management.

Answer:

Introduction: Big data describes the large amount of data, structured and unstructured, that a business generates on a day-to-day basis. The real advantage of big data does not lie in the amount of data but in the process of using those data for the organization's benefit (Hazen et al. 2014). Strategic business moves and better decisions can be made by management through analyzing big data. By this definition, big data as a concept requires three distinct layers before application: more data, processing systems, and analytics. If big data only recently entered the supply chain management spotlight, it may be because the technology only recently reached that last layer and can now deliver insights. Data collection is a crucial process in big data: as the amount of data is huge, it is important to apply a proper process for capturing all the business data. Data collection is only the start of using big data for gaining business intelligence; in order to make use of it over a longer period, storing the data is a crucial step (Wu et al. 2014). A consumer-centric product is the key to great product innovation, and determining the requirements of consumers is the first and most crucial stage in developing such a product. This report consists of topics such as data collection and storage, data in action, and business continuity. The data collection and storage section describes the systems that collect and store data. The data in action section describes consumer-centric product design and includes information regarding recommendation systems. The final section states solutions for maintaining an online business in case of power outages and other disasters. 
Data Collection and Storage:

Data Collection System: The data collection system of an organization in supply chain management will aggregate and evaluate collections of information in a continuous and efficient way. An organization should implement a modern data collection system in the SCM so that huge amounts of data can be taken into the system using advanced technology (Jagadish et al. 2014). Advanced data collection systems are also more effective at parsing and analyzing the data.

Table 1: Types of Data in Supply Chain (Source: Waller and Fawcett 2013)

Inventory. Volume: continuous inventory at more than one location, collected at an individual level such as size, style and color. Velocity: from hourly updates to monthly updates. Variety: inventory at stores, various online vendors, warehouses and internet sources.

Sales. Volume: detail around the sale of the organization's products, including quantity, items sold, customer data, price, and time of day as well as date. Velocity: from hourly and daily to weekly and monthly updates. Variety: distributor sales, international sales, direct sales, internet sales and competitor sales.

Location and Time. Volume: sensor data to detect the location of items in stores and within distribution centers, including misplaced inventory and transportation units. Velocity: the database will be updated frequently with fresh movements and locations. Variety: the data will indicate not only an item's position but also what is near it, who moved it, its forecasted path forward and the route to the forecasted place; location positions will be time-stamped using mobile devices.

Consumer. Volume: data useful for understanding purchasing and decision behavior, such as items bought, timing, items browsed, frequency and dollar value. Velocity: from each click to each card usage. Variety: face-profiling information for shopper emotion detection and recognition; customer sentiment about products bought, based on Likes and Tweets; eye-tracking data and product reviews.

Collecting Big Data using a Body Area Network: In a Body Area Network (BAN), several sensor nodes are placed on a human subject to gather various vital-signs data, such as heart rate, blood pressure, glucose, ECG, breathing rate, humidity, temperature, movement, direction, proximity, and so on. The sensor nodes on the human subject can form a star topology, in which the sink node gathers the sensing data from all the body sensors (Quwaider and Jararweh 2015). The gathered data are then aggregated into one packet to reduce the cost and the amount of communication in the network. Via Bluetooth, the aggregated packet is sent to a Personal Digital Assistant (PDA) or a smartphone running a BAN monitoring application. A web-service module then uploads the observed data to Internet servers using either Wi-Fi or cellular technology, such as 3G or LTE, for data communication.

Storage System: Nanophotonics-enabled optical storage techniques will be used for storing the big data of an organization's supply chain and operations. Progressive developments in storage systems are sought to turn ordinary gigabyte optical discs into ultrahigh-capacity storage media. In recent decades, nanophotonics has emerged as a rapidly growing field of light–matter interaction, largely owing to recent advances in nanotechnology that allow better control of material properties at the nanometer scale, as well as the availability of modern nanophotonic probes. 
Features of the Storage System: The features of nanophotonics-enabled optical storage are ultrahigh density, multidimensional storage, 3D super-resolution recording and ultrahigh throughput.

Ultrahigh Density: Recent advances in nanophotonics can facilitate either the encoding of information in physical dimensions, for example those defined by the frequency and polarization parameters of the writing beam, or the achievement of three-dimensional (3D) super-resolution recording, breaking the traditional storage capacity limit (Gu, Li and Cao 2014).

Multidimensional Storage: Nanophotonics allows for sharp color and polarization selectivity in light–matter interactions at the nanometer scale. For instance, light can couple to surface plasmons, the collective oscillations of free electrons in metallic nanoparticles, which yield deterministic spectral responses related to their sizes and shapes. These appealing properties make nanoparticles suitable for implementing spectrally encoded memory. Accordingly, based on the principle of light-induced shape transitions, three-color spectral encoding using gold nanorods of various sizes has been demonstrated.

3D Super-Resolution Recording: A variety of strategies have been proposed and demonstrated to break the diffraction barrier in the near-field region and achieve super-resolved areal memories. However, these approaches do not offer the ability to record information in the volume of a medium. Recently, inspired by a diffraction-unlimited far-field imaging approach, researchers have developed super-resolution photoinduction-inhibition nanolithography, which can break the diffraction barrier and achieve 3D super-resolved writing (He et al. 2016). 
Ultrahigh Throughput: Various optical parallelism techniques for creating multifocal arrays with single laser shots have been proposed and experimentally demonstrated, including microlens arrays, diffractive optical elements, the Debye-based Fourier transform and dynamic computer-generated holograms. Among these, the Debye-based Fourier transform enables the generation of diffraction-limited multifocal spots using a high-NA objective, in which each focal spot can be dynamically programmable, a requirement for ultrahigh-density optical recording (Gu, Li and Cao 2014).

Data in Action:

Customer-Centric Product Design: Customer-centric is an approach to carrying out business that concentrates on providing a positive customer experience at the time of sale and after the sale, in order to drive profit and acquire competitive advantage. Fader's Focus on the Right Customers for Strategic Advantage explains that, because not all customers turn out to be profitable, organizations that aim to be customer-centric and gain strategic advantage should identify their best customers and concentrate on building their products and services around the requirements of those specific people (Seybold 2014). This is accomplished by gathering customer data from multiple channels and analyzing it to better understand and segment customers. One way to determine whether a customer is high quality, according to Fader, is to calculate their customer lifetime value (CLV), which predicts the net profit a business will obtain from its entire future relationship with that customer. High-quality customers are those who remain loyal to the organization and do not leave unless given a very strong incentive to do so; these customers have a high CLV and collectively have a low attrition rate (Kolko 2015). Customer-centric organizations strive to acquire, retain and develop this kind of customer by improving their experience. 
Four steps can be carried out to implement customer-centric product design within the organization. The four steps are as follows.

Listening: The organization must listen often, early and in different ways to understand the perspective of the customers. Involving prospective consumers in the ideation and innovation process is key to collecting data. Listening should continue throughout the product development process (Verhoef and Lemon 2013); it shows the organization what needs to be changed in the product or what needs to be developed, and the more the organization listens to consumers, the more the quality of the design process improves. There are various methods an organization can use to gather data from consumers, such as questionnaires, interviews, brainstorming and many more.

Asking Questions: The questions the organization asks consumers affect the quality of the data collection process. It is highly recommended that the right questions be asked; the required data can be gathered only if the queries are put properly (Chuang, Morgan and Robson 2015).

Collecting Deep Data: Customer-centric product design is a data-driven process. The organization must analyze the gathered data properly so that crucial information can be revealed (Verhoef and Lemon 2013).

Invest: The organization must invest in the product-related data gathering process. To acquire all the data required to develop the customer-centric product, every method adequate for the process should be considered.

Recommendation System: Recommendation systems use a number of different technologies. The study classifies these systems into two broad groups. Content-based systems examine properties of the items recommended. 
For example, if a Netflix user has watched many cowboy movies, the system recommends a movie classified in the database as belonging to the "cowboy" genre (Lorenzo-Romero, Constantinides and Brünink 2014). Collaborative filtering systems recommend items based on similarity measures between users and/or items; the items recommended to a user are those preferred by similar users. However, these technologies by themselves are not sufficient, and some newer algorithms have proved effective for recommendation systems.

Application of Recommendation Systems in SCM and Operations Management: Perhaps the most important application of recommendation systems is at online retailers. Amazon and similar online vendors try to give each returning customer several suggestions of products they might like to buy. These suggestions are not random; they are based on the purchasing decisions made by similar customers, or on other techniques.

Business Continuity: The following approaches can be considered for ensuring business continuity at the time of power outages.

Using a Backup Generator: The first and foremost thing to look into is a backup generator. These big hulking boxes of electricity usually sit outside the business walls and get loud when running. A backup generator may never need to be used, but it improves the odds of business continuity (Chang 2015).

Disaster Recovery Plan: Regardless of industry or size, when an unexpected event stops day-to-day operations, an organization should recover as quickly as possible to ensure it keeps providing services to customers and clients (Sahebjamnia, Torabi and Mansouri 2015). Downtime is one of the greatest IT costs that any business can face. 
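The content-based and collaborative filtering approaches described above can be illustrated with a minimal user-based collaborative filtering sketch. The users, items and ratings below are invented for illustration; real systems use far larger matrices and more sophisticated similarity and ranking methods.

```python
# Minimal user-based collaborative filtering sketch (invented ratings).
# A user is recommended the unseen item rated highest by their most
# similar peer, using cosine similarity over co-rated items.
from math import sqrt

ratings = {                        # user -> {item: rating}
    "alice": {"A": 5, "B": 3, "C": 4},
    "bob":   {"A": 5, "B": 3, "D": 5},
    "carol": {"A": 1, "B": 5, "D": 2},
}

def cosine(u, v):
    shared = set(u) & set(v)       # items rated by both users
    if not shared:
        return 0.0
    num = sum(u[i] * v[i] for i in shared)
    den = sqrt(sum(u[i] ** 2 for i in shared)) * sqrt(sum(v[i] ** 2 for i in shared))
    return num / den

def recommend(user):
    # Find the most similar other user, then suggest their top unseen item.
    others = [(cosine(ratings[user], ratings[o]), o)
              for o in ratings if o != user]
    _, nearest = max(others)
    unseen = {i: r for i, r in ratings[nearest].items()
              if i not in ratings[user]}
    return max(unseen, key=unseen.get) if unseen else None

print(recommend("alice"))          # bob is most similar, so "D" is suggested
```

A content-based variant would replace the user–user similarity with a comparison between item attributes (such as genre tags) and the user's viewing history.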
Based on 2015 disaster recovery statistics, downtime lasting one hour can cost small organizations as much as $8,000, medium-sized organizations $74,000, and large enterprises $700,000. For SMBs especially, any extended loss of productivity can lead to reduced revenue through late invoicing, lost orders, increased labor costs as staff work extra hours to recover from the downtime, missed delivery dates, and so on. If major business interruptions are not anticipated and addressed today, it is entirely possible that the negative outcomes of a sudden disaster will have long-term implications that affect an organization for years (Sahebjamnia, Torabi and Mansouri 2015). By having a disaster recovery plan in place, an organization can protect itself from various risks, including out-of-budget costs, reputation loss, data loss, and negative impact on customers and clients.

Conclusion: From the above study, it can be concluded that big data can be used for gaining business intelligence in supply chain management and operations management. Although big data has become a contemporary buzzword, it has significant implications in our discipline, and presents both an opportunity and a challenge for our approach to research and teaching. We can easily see how data science and predictive analytics apply to SCM, but sometimes find it more difficult to see the direct connection of big data to SCM and operations management. Through implementing a recommendation system, an organization can establish better engagement with the consumer, which is crucial for supply chain management; control over inventory and retailing can also be enhanced through its use. The purpose of the study is to provide a timely assessment of the field and motivate additional research and pedagogical developments in this domain. 
As was illustrated, the field of SCM predictive analytics provides a promising avenue for transforming the management of supply chains, and offers an exciting array of research opportunities. Optimizing the supply chain with big data is essential for all organizations. As operations and supply chain data continue to grow, maintaining that data is becoming an issue for organizations; if big data is used properly for generating business intelligence, it allows organizations to capture, store and analyze massive amounts of data effectively. The report has provided descriptive information regarding various factors through which big data can affect supply chain management. The study failed to link the data collection system and the recommendation system.

Recommendation:

Understanding the Scale, Scope and Depth of Data: To drive contextual intelligence, supply chain management should use sufficient data sets, understanding the scope, depth and scale of the data. In total, fifty-two big data sets need to be collected by supply chain management.

Making Supply Networks More Complex: Enabling more complex supplier networks that focus on knowledge sharing and collaboration as the value-add, rather than just completing transactions. Big data is changing how supplier networks form, grow, proliferate into new markets and mature over time. Transactions are not the only goal; creating knowledge-sharing networks relies on the insights gained from big data analytics.

Integrating Big Data and Advanced Analytics: Big data and advanced analytics are being integrated into optimization tools, demand forecasting, integrated business planning, and supplier collaboration and risk analytics at a quickening pace. 
Control-tower analytics and visualization are also on the roadmaps of supply chain teams currently running big data pilots.

Considering Big Data Analytics an Important Part: Sixty-four percent of supply chain executives consider big data analytics a disruptive and important technology, setting the foundation for long-term change management in their organizations.

Utilization of Geo-Analytics: Using geo-analytics based on big data to consolidate and optimize delivery networks. One of the examples given is how the merger of two delivery networks was planned and optimized using geo-analytics. Combining geo-analytics and big data sets could drastically reduce cable TV technician wait times and drive up service accuracy, solving one of the best-known service challenges of companies in that business.

Recognizing the Impact of Big Data: Big data is improving organizations' reaction time to supply chain issues (forty-one percent), increasing supply chain efficiency by 10% or more (thirty-six percent), and enabling greater integration across the supply chain (thirty-six percent).

Embedding Big Data Analytics within Operations: Embedding big data analytics in operations leads to a 4.25x improvement in order-to-cycle delivery times, and a 2.6x improvement in supply chain efficiency of 10% or greater.

Understanding the Impact of Big Data on the Organization's Finances: Greater contextual intelligence of how supply chain tactics, processes and operations affect financial objectives. Supply chain visibility often refers to being able to see several supplier layers deep into a supply network. 
It has been my experience that the ability to track the financial results of supply chain decisions back to financial objectives is feasible and, with big data applications integrated into financial systems, very effective in businesses with rapid inventory turns.

Enhancing Supplier Quality: Improving supplier quality from supplier audit to inbound inspection and final assembly with big data. The organization can develop a quality early-warning system that identifies quality issues and then defines a prioritization framework that isolates them faster than more traditional methods, including Statistical Process Control (SPC). The early-warning system is deployed upstream to suppliers and extends to products in the field.

Reference List:

Chang, V., 2015. Towards a Big Data system disaster recovery in a Private Cloud. Ad Hoc Networks, 35, pp.65-82.

Chuang, F.M., Morgan, R.E. and Robson, M.J., 2015. Customer and competitor insights, new product development competence, and new product creativity: differential, integrative, and substitution effects. Journal of Product Innovation Management, 32(2), pp.175-182.

Gu, M., Li, X. and Cao, Y., 2014. Optical storage arrays: a perspective for future big data storage. Light: Science and Applications, 3(5), p.e177.

Hazen, B.T., Boone, C.A., Ezell, J.D. and Jones-Farmer, L.A., 2014. Data quality for data science, predictive analytics, and big data in supply chain management: An introduction to the problem and suggestions for research and applications. International Journal of Production Economics, 154, pp.72-80.

He, Z., Wang, X., Xu, W., Zhou, Y., Sheng, Y., Rong, Y., Smith, J.M. and Warner, J.H., 2016. Revealing defect-state photoluminescence in monolayer WS2 by cryogenic laser processing. ACS Nano, 10(6), pp.5847-5855.

Jagadish, H.V., Gehrke, J., Labrinidis, A., Papakonstantinou, Y., Patel, J.M., Ramakrishnan, R. and Shahabi, C., 2014. 
Big data and its technical challenges. Communications of the ACM, 57(7), pp.86-94.

Kolko, J., 2015. Design thinking comes of age. Harvard Business Review, 93(9), pp.66-71.

Lorenzo-Romero, C., Constantinides, E. and Brünink, L.A., 2014. Co-creation: Customer integration in social media based product and service development. Procedia - Social and Behavioral Sciences, 148, pp.383-396.

Quwaider, M. and Jararweh, Y., 2015. Cloudlet-based efficient data collection in wireless body area networks. Simulation Modelling Practice and Theory, 50, pp.57-71.

Sahebjamnia, N., Torabi, S.A. and Mansouri, S.A., 2015. Integrated business continuity and disaster recovery planning: Towards organizational resilience. European Journal of Operational Research, 242(1), pp.261-273.

Seybold, P.B., 2014. Outside Innovation. HarperCollins e-books.

Verhoef, P.C. and Lemon, K.N., 2013. Successful customer value management: Key lessons and emerging trends. European Management Journal, 31(1), pp.1-15.

Waller, M.A. and Fawcett, S.E., 2013. Data science, predictive analytics, and big data: a revolution that will transform supply chain design and management. Journal of Business Logistics, 34(2), pp.77-84.

Wu, X., Zhu, X., Wu, G.Q. and Ding, W., 2014. Data mining with big data. IEEE Transactions on Knowledge and Data Engineering, 26(1), pp.97-107.