
Q4 Earnings: Medicure

I didn’t love everything I read in Medicure’s fourth quarter results.  So much so that I reduced my position.  Here’s why.

With the acquisition of Apicore, Medicure took on much more debt than it previously carried, adding about $60 million to fund the deal.  The debt isn’t cheap: 9.5% interest plus 400,000 warrants issued with a $6.50 strike (the MD&A puts the effective interest rate at 12%).  With a market capitalization of $100 million, the additional debt is not inconsequential.
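To put that debt load in rough perspective, here is a quick back-of-envelope sketch of the annual interest burden implied by those figures (my own arithmetic, not company guidance):

```python
# Back-of-envelope sketch (my own, not company guidance): rough annual
# interest burden on the Apicore acquisition debt, using the figures above.
debt = 60_000_000          # approximate debt added for the Apicore deal
coupon = 0.095             # stated interest rate
effective = 0.12           # effective rate per the MD&A (includes the warrant cost)

cash_interest = debt * coupon          # ~$5.7 million per year in cash interest
effective_interest = debt * effective  # ~$7.2 million per year all-in

print(f"Cash interest: ~${cash_interest/1e6:.1f}M/yr")
print(f"Effective interest: ~${effective_interest/1e6:.1f}M/yr")
```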

The debt isn’t crippling, but it makes it crucial that Apicore performs in an accretive manner.  But from the disclosures provided by Medicure, it’s not clear to me just how accretive Apicore is.

The Apicore deal closed December 1st, 2016, and the one month numbers were excellent: sales of $7.8 million and gross margin of $4.5 million (after adjusting for inventory at the time of the acquisition).  But on the conference call the company said that December is by far the strongest month for Apicore and that we should not expect that level of results over the full year.  In the financial statements Medicure disclosed that had Apicore been part of Medicure for the full year, net income would actually have been lower, which is a bit worrisome:

On the other hand, in the quarterly presentation Medicure gave color around the additional EBITDA Apicore would have contributed had it been consolidated for the full year 2016.  It is significant (around $6 million).  But I don’t know how to reconcile that with the lower net income.

It’s not so much that I think Apicore is going to be dilutive to earnings.  I doubt that.  It’s just that the company hasn’t made it clear what to expect, so it’s difficult to forecast going forward.

The other development that gives me pause is the move to 2-6 hour infusions of Aggrastat.  Basically, physicians are using Aggrastat for shorter durations.  This helps them limit side effects, and they are seeing acceptable efficacy.  The company described this as a positive development because (a) Aggrastat is the only glycoprotein inhibitor (GPI) that has shown efficacy at the shorter infusion time, and (b) the reduced side effect profile will lead to usage by physicians previously wary of using GPIs.  I quote/paraphrase the comments around this from the call below:

Only about 15-20% of physicians use a GPI.  This used to be 75%.  The reason for the decline is bleeding risk.  Because Aggrastat doesn’t have a minimum infusion time, it is better positioned for physicians that don’t use GPIs or minimize their use of them.

While it may end up being positive, I can’t help but think that in the short run this is a headwind for Aggrastat demand.  Physicians are going to be using less Aggrastat; that seems like the bottom line here until these other factors catch up.

The final thing that I didn’t love about the disclosures is that I found some mistakes in them.  For example, on page 25 of the MD&A the total debt does not add up from its constituent pieces.  Similarly, on page 24 the operating income isn’t right in the EBITDA reconciliation (though the EBITDA end result itself is fine).

Nevertheless there are positives.  With the acquisition of Apicore, Medicure has a number of generics on the horizon that will generate growth.

In March Medicure announced FDA approval for tetrabenazine, a generic form of a drug used for Huntington’s disease called Xenazine.  Xenazine had over $300 million in sales in 2015, so if the generic can take a decent percentage of that it could be material.  They also filed an abbreviated new drug application (ANDA) for another generic in December and have two others in the development stage.  In total there are 15 ANDAs in the pipeline.

So there is quite a bit of potential for growth.  But it could still be a number of quarters off.  Meanwhile the stock has an enterprise value of over $150 million and trailing EBITDA of $15 million, so it’s not particularly cheap.  I do like the growth pipeline though.  I’m just not sure at this point, so I took some off.

Q1 Earnings: Radisys

Radisys stock has been pretty flat since it announced its first quarter results, and while I can understand the lack of interest, I was nevertheless pleased with what I heard on the call.

The first quarter was on the low end of guidance.  Revenue came in at $37.6 million, while the company had anticipated a range of $37-$41 million.  Guidance for the second quarter was $41-$47 million, which is pretty close to my expectation, though maybe the top end is a couple million higher.

The stock didn’t move on any of this and it shouldn’t have.  There is nothing surprising.  The Radisys story continues to be a wait and see one.  We wait for announcements of new DCEngine, FlowEngine, and MediaEngine orders and we’ll see if they materialize.

There was lots of qualitative progress on this front but not much quantitative in the way of meaningful orders yet.

Here are the highlights:

Verizon announced their Exponent platform in February.  The platform allows carriers to deploy off-the-shelf(ish) next-gen solutions using technology Verizon has developed.  Brian Bronson (CEO) said that DCEngine and FlowEngine are designed into the Exponent solutions and that they have seen incremental customer relationships develop.  While this is very new and the relationships are mostly still in the early stages, Bronson did say that “a couple of engagements are fairly close”.

A second partnership was announced with Nokia.  This one revolves around MediaEngine and seems very significant to me.  Nokia will be marketing MediaEngine as their single MRF solution; the Alcatel-Lucent MRF will be mothballed in favor of the Radisys product.  The partnership is expected to open access to new CSP customers.  They expect that, given Nokia’s customer base, MediaEngine will be the MRF in 3 of the 4 CSPs in North America, and that there are opportunities in Asia/India as well (beyond Reliance).  In the past a number of MediaEngine deals were split, with MediaEngine winning half the share and the Alcatel-Lucent MRF picking up the other half; those deals should now go entirely to MediaEngine.

They are close to closing 3 new carrier wins with DCEngine.  First, they are pretty close to signing a master agreement with a US Tier 1 CSP.  They said this wasn’t Verizon (already the primary DCEngine customer) so I think it has to be AT&T (??).  They were confident enough to say that they expect purchase orders this quarter from this operator.

Second, Reliance Jio is trialing DCEngine for a single use case and they expect orders with respect to this use case in the second half.  Third, a South East Asian CSP received proof-of-concept DCEngine units in the first quarter for a use case that has a revenue potential of around $20 million.

They formally announced the new FlowEngine, called the TDE-2000, in the first quarter.  Management provided color around a strong response and the initiation of trials and proof of concepts, but nothing specific.  They did say that Verizon is using the older version of FlowEngine for a new packet-inspection use case (they have used it in the past as an edge router) and that they expect “incremental deployments in the second half” for this use case.  Bronson also said that by year end he expects at least one of the DCEngine wins to incorporate FlowEngine.

With MediaEngine, the big news is the Nokia partnership that I already mentioned, but there also appears to be some progress around transcoding.  They are still looking “to disrupt transcoding”.  I talked in this post about how MediaEngine provides an alternative to session border controllers (SBCs) for performing transcoding operations (there is also a good YouTube video on how MediaEngine can save money on transcoding).  The punchline is that Radisys can offer a solution that is 3x to 5x cheaper.  On the call they disclosed that MediaEngine is already deployed to a small extent performing the transcoding function with a couple of operators, which is new information.  They also have a new “in” with operators, as they can leverage the Nokia-ALU relationship; Nokia-ALU is the number two SBC provider in the world.  Bronson said there are a couple of operators that have “strong interest” and that they are looking at a 7-figure deal.

So is it good or bad?

You can look at this one of two ways.  You can optimistically count up all the engagements, trials, proof of concepts and agreements on the verge of being signed and think that the second half of 2017 and 2018 are going to be a great ramp.  Or you can pessimistically point out that nothing has been signed yet, that there is still very little incremental revenue beyond Verizon, Reliance Jio and some piecemeal one-offs, and that the clock continues to tick.

Both of these perspectives seem perfectly valid.  I prefer to take the first, mainly because I believe the upside in the stock is significant if it turns out to be right.

Q1 Earnings: Silicom

Silicom had a good quarter.  They beat on revenue ($25.3 million) and they guided to a nice revenue increase in the second quarter ($28.5-$29.5 million).

As I wrote about last month, the rise in the stock price has been mostly due to the large switch fabric NIC design win that they announced in March.  On the conference call management provided more color around this win.

The win is for the design of a new, custom 100G switch fabric NIC to be deployed in datacenter racks.  The design presents a number of technical challenges and they are still working through those challenges.  So far Silicom has received an initial $25 million purchase order and a follow-on $8 million order from the customer.  The POs are being written even though the card is still in the beta phase and thus still under development.  The POs are to ensure that Silicom has components on hand and can ramp production quickly to the $30 million plus run rate once a final design is approved.

Interestingly, Shaike Orbach, Silicom’s CEO, said that they were engaged with 10-15 other cloud players for similar designs.  He tempered those remarks by saying that the sales cycle is long (it can take as much as two years), that some of the engagements would be for smaller wins (though some could be bigger), and that not every cloud vendor’s architecture lines up as well with Silicom’s technology as this vendor’s did.

At any rate though, there is a large pipeline of potential deals.  As an aside, if anyone knows who the existing win is with, please email or direct message me.

SD-WAN

There were also comments around SD-WAN.  They have a similar number of SD-WAN prospects that they are talking to (around 10).  These include traditional telecom vendors that have SD-WAN solutions, start-ups, and even service providers.  Talking directly to service providers is a new development as Silicom has traditionally worked through OEM vendor channels.

There was a bit of color around the potential of the SD-WAN opportunity.  Alex Henderson from Needham asked the following question:

If it’s the entire white label box at the edge, I would think that A, that would be a little bit lower margin but B, a lot of revenue associated with that because we’re talking about 1000s of branches and individual deployments here, that seems like a very big ramp when that starts to kick in. Am I thinking about that right? I mean it seems like a very large number?

Orbach’s response was to agree that potential quantities were “very big” and that they had some competitive advantage in that they could provide features not available from others.  I’m still quite excited about the SD-WAN opportunity.

FPGA Opportunity

One comment that came up a few times on the call was the growing importance of their FPGA solutions.  Orbach said that while the switch fabric win is not an FPGA solution, Silicom’s FPGA capabilities were instrumental in getting the win, as the customer expects future generations of the product to require FPGAs.

At the end of the call Orbach gave more color around the importance of FPGA solutions (my underline):

So first of all I would like to tell you that we think that FPGA technology and solutions around FPGA are going to be extremely, extremely important. We’re investing in that. You understand it may take some time but we believe that it will be extremely important. Just like you have said, I mean one of the reasons I mean there are two I would say trends, not trends, but two events and — well even event is not the right word but two things which are happening together which I believe are important to understand, maybe even three. So one is again the cloud, I mean the cloud, I think that cloud vendors do understand today and that’s by the way why we have been able to success even with that customer that in order for their cloud to be effective, in order to cut down their expenses they need to have several ways or to do offloading within the cloud. Our FPGAs seem to be recognized now almost by everyone as the right technology for the purpose of doing this kind of offloading. I think that although — when I’m saying cloud by the way I mean the whole package, I mean it’s cloud and NFV, SD-WAN virtualization, all that together. So when build systems using these technologies you would need to do offload, the right technology to do offload is FPGA.

Orbach also hinted at collaboration with Intel (and their Altera FPGA designs) and referred to an MOU around FPGA development that he said was important.

I did a little bit more research into FPGA development and this looks like an area that is beginning to hit its stride, with more and more use cases.  FPGA designs offer more flexibility and lower up-front cost, and are preferable for vendors that either don’t want to commit a large spend to a custom ASIC design or don’t have the funds for such a closed-ended design.  It sounds like the performance gap with ASICs, which has largely been what has limited their use, has closed considerably over the last few years.

I found one white paper by Altera/Intel particularly insightful.  The paper describes 3 evolving use cases for FPGAs that all seem very closely aligned with Silicom’s strengths.  They are:

  1. Datacenters
  2. 400G cards
  3. Wireless Remote Radio Units

The paper basically suggests that the requirements of the next-gen designs will fit much better with FPGA solutions than ASIC solutions.

While Orbach and the above paper suggest that big FPGA wins are still some time in the future, this really starts to clarify the runway of opportunities for Silicom for me.  I think this could be a multi-year run for the stock, as the company seems very well positioned for the trends toward white-box hardware, offloading functionality to secondary NIC cards, and more FPGA-based solutions.  I didn’t add to my position on the results, but if there was enough of a correction I certainly would.

Q1 Earnings: Radcom

I’ve fallen behind writing about the earnings reports so far this season.  In the next few posts I am going to catch up with a few brief thoughts on a number of the reports that came out over the last week and a half.  Starting with…

Radcom

I’m surprised the stock has moved so much after the company reported first quarter results on Thursday.  I didn’t think there was a lot of new information provided.  Revenues ($8 million) were in line with expectations, guidance was maintained, and the trials are progressing.  On this the stock has jumped almost 20%.  Go figure.

I guess investors have focused on the progress.  The four trials are wrapping up, and within the next 6-9 months they expect to be able to announce wins, though nothing specific was provided.  Ravkaie (the CEO) gave positive color around the potential for wins, but that isn’t anything new; he’s been saying that since the trials were announced last summer.  They indicated discussions with new carriers that will begin trials in the third and fourth quarters.  And some of the tenders that are now coming in are for pure NFV solutions, which is a new development (most of their engagements have been for hybrid deployments as carriers transition slowly to NFV) and one that should play right into their wheelhouse.

I unfortunately got shaken out of about 25% of my position the day before earnings.  Doesn’t it always happen that way?  Netscout, a competitor, announced a win with Vodafone that day, and my initial reaction was that maybe Radcom had lost that trial (though no one is sure who Radcom is in trials with, Vodafone is one of the companies that comes up in discussions).

On further reflection, that might be wrong.  It was pointed out to me that the Netscout press release refers to passive probes, which are not the same thing as the active probes that Radcom’s MaveriQ solution uses (passive probes are more akin to offline testing and measurement, the sort of thing Exfo does).  Second, on the conference call, Ravkaie was specific about calling out Netscout as a competitor and saying that they do not have a comparable NFV-ready solution to compete with Radcom.  So it seems more doubtful to me now that the Vodafone win is a loss for Radcom than it did at first glance.

Honestly, I think my move was driven as much by position size as by the Netscout news.  Whenever I have a position that is big I get quick on the trigger with any rumor to the contrary.  The fact is that the morning of earnings I had the opportunity to add back at $18, but I didn’t.  I was not convinced the results were incrementally good enough to justify running back in.  That proved to be wrong, at least for now.  So I will just participate in the move with the shares I have and see whether it is for real.

Week 304: More on Edge

Portfolio Performance

Top 10 Holdings

Thoughts and Review

In my June update I took space to describe some of the attributes of my edge.  At that time I didn’t define it specifically, and so I wanted to extend that discussion here.  To repeat the definition that I put forth back then:

An edge is essentially the advantage that allows you to beat the market more than it beats you.  For many of these traders understanding their edge; a system, a pattern, a money management technique; has been a major step toward consistent success.

I think I have put up enough years of out-performance to tentatively conclude I have some sort of edge.  It’s still possible that I don’t; maybe I will blow up yet and these past years will prove to be a statistical aberration.  But as time goes on that becomes less likely.

So what is it?

First, I do quite a bit of research.  Now maybe I’m not the most exhaustive researcher; I know some folks that will, at minimum, read through the last 5 years of 10-Ks before pulling the trigger, but nevertheless I am on the heavy side of the research spectrum.  I think it’s fair to say that I make decisions on a more informed basis than the average investor.

Second, I’ve come up with a methodology that works, both absolutely and for my personality.  I take small positions that let me be wrong without losing a lot of money.  I rarely add to those positions if they fall and sometimes cut them if they fall too much even if I have no news to suggest anything has changed.  And I add to the positions as they rise and price movement reinforces the thesis.

This works for me because in the real world I’m not very good at making decisions.  Just to give a couple examples from every day life, I don’t like having to choose the TV show we watch at night, what food we will have for dinner, or where we are going to go on vacation.  I would rather have someone else make the decision and just go with the flow.  I am fortunate to have an understanding wife.

I invest in a way that is in tune with this nature.  I rarely commit to an idea unless I am deep into it.  Even with my biggest positions (Identiv, Combimatrix, Radcom), I don’t feel sold on the ideas.  I’m more of a renter.  I am not sure if they will pan out and I am ready to run if something goes awry.  It’s easier for me to pick a stock than what’s for dinner because I know it’s not for good.

The final element of my edge is the type of stocks I look for.  I try to find companies that, while they may only have a small probability of going up, have the chance to go up by multiples if things play out in a certain way.

To put it another way, if I am right 30% of the time and on average my gains are 20% and my losses are 20%, I am going to lose money.  But if my gains can be 100% and my losses 20%, then I am going to do quite well even if I’m wrong most of the time.  So I am wrong a lot, I change my mind a lot, but when I’m right it’s often for a double, a triple or even more.
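To make that arithmetic explicit, here is a quick sketch of the expected return per position under those two scenarios (illustrative only):

```python
# Quick illustration of the payoff math described above (illustrative only).
def expected_return(win_rate, avg_gain, avg_loss):
    """Expected return per position given a hit rate and average gain/loss."""
    return win_rate * avg_gain - (1 - win_rate) * avg_loss

# Right 30% of the time, winners +20%, losers -20%: a losing proposition.
print(expected_return(0.30, 0.20, 0.20))   # -0.08 -> lose ~8% per position on average

# Same 30% hit rate, but winners +100%, losers -20%: quite profitable.
print(expected_return(0.30, 1.00, 0.20))   # +0.16 -> gain ~16% per position on average
```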

What I did last month – Aehr Test Systems

It’s actually been 5 weeks because we were on vacation for the last week, and so I didn’t get this update out on time.  Even with the extra week, I didn’t do too much.  In fact I only made three trades.  One, Catalyst Biosciences, was a fluke that I discussed in my last update.  The stock is back down to where it was and I didn’t actually buy it anywhere but the practice account, so who really cares.

The other two were new positions.  The one I’m going to mention in this update is Aehr Test Systems.  They are a fairly tiny company ($90 million market capitalization) that makes testing equipment.  They have a unique design (I don’t believe there is a lot of direct competition) that can test at the wafer level rather than the module level, which eliminates much of the potential for mechanical failure and improves quality control.  They started selling a multi-wafer testing machine called the Fox-XP system back in July and they have started to see orders come in.  Their test equipment is sold to some large companies, like Apple and Texas Instruments (Apple and TI accounted for 47% and 32% of revenue in 2016), and they have made references to being in talks to sell product to a Korean firm that seems likely to be Samsung.

The stock doesn’t appear cheap at a glance.  Revenues in the last 9 months were only about $12 million so on a trailing sales basis the stock looks wildly overpriced.

What makes it interesting is that we are only starting to see orders for the Fox-XP system.  So far these orders have been for prototypes to verify the concept.  The units sell for $4 to $5 million, so even a trickle of prototypes is incrementally material to the company.  But the orders could scale substantially if the proof of concept testing goes well.  The company doesn’t give a lot of guidance, and there isn’t much of an analyst following to prod information out of them, but on the third quarter conference call management said that if successful with their lead customer (probably TI?) they could ship 10 systems per program, and that they are currently working on 2 programs.  So the lead customer alone could amount to a $40 million to $60 million opportunity per program.

If the Fox-XP takes off, the stock is going to move significantly.  Will it?  I don’t know.  It has a chance though, and that is worth a small position.

The dangers of short-term funding

In October of last year I wrote that I was short Canadian alternative lenders and mortgage insurers in the wake of the Federal government mortgage rule changes.  For 5 months these positions did poorly.  I began to think my puts would expire worthless and my shorts would be tax loss candidates.  But last week the bet was vindicated as I took profits after the alleged fraud at Home Capital.

My thesis was not premised on the discovery of fraud.   I thought there was a reasonable chance something would be uncovered as the market unwound but that wasn’t my primary reason for going short.  Instead I thought the measures the government put in place in October would finally cool down the housing market and that, given that many of the measures were targeted at alternative lenders and insurers, these companies would suffer the most.

That hasn’t happened, so in that way I was lucky.  But what I did get right was how things would unravel once the ball got rolling.

It cannot be overstated how precarious a company is if they lend long, borrow short and have a funding source that is easily called away.  If any uncertainty develops about their lending book, the run on funding can be swift and fierce.

The collapse of Home Capital was precipitated by their dependence on high interest deposits to fund part of their loan book.  Those deposits were available on demand, so at the first allegation of wrongdoing, many were pulled.  Why not?  Who wants to take a chance with their money for an extra percent?  Adding to this outflow, there is and will continue to be a slow motion run on their GIC funding, much of which will mature over the next year and almost assuredly not be renewed.

This capriciousness is why I don’t have the stomach to hold non-bank financials through any bouts of turmoil (think back to New Residential or Northstar).  You just never know when the funding side is going to tighten, and when it does an extremely profitable business model can be flipped to insolvency in a heartbeat.  Again, and I know I’m repeating myself, but I don’t think you can overstate how precarious it is to lend long, borrow short and have funding callable on demand.  Everything is great until it isn’t, and then it’s all over.

As for the Canadian housing market, it continues to tick on.  It will be interesting to see how the events of the last week interact with the price rise of homes in Southern Ontario and coastal BC.  We are all familiar with how the US played out.  There, the topping out of prices was the catalyst that collapsed the loans and tightened credit.  I wonder if it has to play out that way, or whether causality could be reversed in Canada, as lenders for marginal buyers lose their funding sources in the wake of Home Capital.

We’ll see.   We sold our rental property a few weeks ago so I don’t even have that chip in the game anymore.  However that wasn’t driven by macro worry; instead we realized that renting is very time consuming and not very profitable (unless you live in the GTA or the coast and your house can appreciate in value by 30% in a year).  Our last tenant also turned out to be a convicted criminal which didn’t help my stress level last year.

It will be another interesting week.

Portfolio Composition

Click here for the last five weeks of trades.  I had to make two adjustments to the portfolio that show up as trades because of name changes that weren’t automatically updated in the practice portfolio.  Accretive Health recently changed their name to R1 RCM, and a while ago Limbach changed their symbol to LMB.  The Limbach situation was brought to me by a reader.  It’s been wrong in my update for a while (displaying the old symbol and its last traded price).  This has been corrected now.

Wading into another Biotech: Eiger Biopharmaceuticals

As I have talked about on occasion, I am a newbie to biotechs.  To help with my learning curve I rely on a number of biotech investing gurus.  One of them is Daniel Ward, and over the last few months I have gotten a couple of ideas from him.  One of these is Eiger Biopharmaceuticals (EIGR).

What I like about Eiger is that they have five trials in mid-stage development and plenty of data readouts in the short term.  So (in theory at least) the stock shouldn’t be killed by any particular readout.

But that thesis hasn’t played out as I had hoped yet.  The stock tanked a few weeks ago from $11 down to below $9.  The collapse coincided with data presented at the European Association for the Study of the Liver (EASL) conference in April.  The results were from phase 2 studies investigating how successful their drugs Lonafarnib and (to a lesser degree) PEG-IFN Lambda were in targeting the hepatitis delta virus (HDV).

These aren’t the only programs that Eiger has in progress.  In total there are 5 programs, targeting 5 indications with 4 different drugs.  In addition to the two drugs targeting HDV, Eiger has a Post-Bariatric Hypoglycemia program, a pulmonary arterial hypertension program and a Lymphedema program.  All of the programs are in Phase 2.

I’m not going to go into all the programs in this post (it’s long enough already).  I’m going to focus on the Phase 2 results for Lonafarnib that were presented at EASL.  For more detail on the other programs, there is a good presentation archived on their website from the BIO CEO conference that describes all the programs and gives some background on the HDV indication that I won’t get into here.

Eiger and HDV

So to recap, Eiger has two drugs targeting HDV: Lonafarnib, which they obtained from Merck, and PEG-IFN Lambda (Lambda), which came from Bristol Myers.  Both of these drugs are in Phase 2 of development.

Three phase 2 studies were presented at EASL: LOWR HDV-2, -3, and -4, as listed below.  The LOWR HDV-1 program had been completed and presented in 2015.  LOWR-2 had already had early results presented in 2016.  LOWR-3 and LOWR-4 were brand new data:

The LOWR-3 and LOWR-4 programs looked at different dosing options of Lonafarnib boosted by another drug called Ritonavir (Ritonavir is a drug used for HIV, and you add it to the mix to improve efficacy).  LOWR-2, which also looked at dosing, had an additional wing of the study where a number of patients trialed a 3-drug cohort that included PEG-IFN Lambda in addition to Lonafarnib and Ritonavir.  This was the only part of any of the studies that looked at Lambda, which will have its own results presented later this year.

So what happened to make the stock tank on the results?  First, they weren’t perceived to be as good as earlier data.  The most apples-to-apples comparison that can be made is between the 100mg leg of the LOWR-3 program and the LOWR-1 program.  Here are the results from the earlier LOWR-1 program after four weeks:

The LOWR-3 program gave 100mg of Lonafarnib and 100mg Ritonavir once daily for 12 weeks to 3 patients. So that should be comparable to the red bar above. The abstract from LOWR-3 is below. The relevant sentences are about 3-4 lines down in the results section (ignore the highlights, they are just artifacts of a word search I was doing).

The mean log decline for LOWR-3 at the 100mg dosage was 0.83 log IU/ml.   This is quite a bit less than the red bar from the earlier program.
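For context on what those numbers mean (my own aside, not from the abstracts): HDV RNA is measured on a log10 scale, so each 1-log decline is a 10-fold drop in viral load.  A quick sketch of the conversion:

```python
# My own illustration of the units: declines are reported in log10 IU/ml,
# so a decline of x logs corresponds to a 10**x fold reduction in viral load.
def fold_reduction(log10_decline):
    return 10 ** log10_decline

print(fold_reduction(0.83))  # LOWR-3, 100mg arm: roughly a 7-fold reduction
print(fold_reduction(2.0))   # a 2-log decline (the threshold cited below): a 100-fold reduction
```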

The LOWR-3 study was more comprehensive than just the 3 patients taking 100mg Lonafarnib for 12 weeks.  There were also 3 patients at a 75mg dose and 3 others at 50mg that took the drug for 12 weeks (I’ll talk more about these in a second).  In addition to this there were 12 more patients given the drug for 24 weeks (at the same dose increments of 50mg, 75mg, and 100mg).  In the abstract it said that six of these 24-week patients saw a greater than 2 log IU/ml decline in HDV RNA, so that’s good.  But there was no mention of an average HDV RNA decline for all 12 patients, which seems an odd omission.  It would be nice to see the entire paper to get all the data.

A second study, LOWR-4, looked at an increasing dosage.  In this study 15 patients were given gradually increasing dosages of Lonafarnib.  They started at 50mg Lonafarnib, escalated to 75mg if tolerated and then to 100mg.  They were also given 100mg of Ritonavir throughout, just like the other study.  Below is the abstract.

The mean decline in HDV RNA for this study was 1.58 log IU/ml which, while below the 2.4 log IU/ml from the LOWR-1 study (remember again the red bars from above), is not too bad considering the dose was lower for some of the study.

Maybe the more interesting take-away from the LOWR-4 study was that the standard deviation across patients was +/- 1.38.  I may be misunderstanding what that means, but it seems like a lot of dispersion to me.  I suspect it suggests that the drug’s performance varies a great deal from patient to patient.

So far what I’ve described is how Lonafarnib didn’t work as well as in previous studies, but that it still worked pretty well.  There was a definitive decline in HDV RNA levels, and it was well tolerated in all 3 studies, so there is no reason a patient couldn’t stay on the drug longer to presumably greater effect.

I think the final piece of the puzzle of why the stock went into a tailspin is evident when we look back at the 12-week dose comparison from the LOWR-3 study (the one I said I’d come back to).  I briefly mentioned how, in addition to the 100mg Lonafarnib arm, there were other patients taking 75mg Lonafarnib and 50mg Lonafarnib along with Ritonavir.  A comparison of the results of these different wings is surprising:

After 12 weeks of therapy, the median log HDV RNA decline from baseline was 1.60 log IU/mL (LNF 50 mg), 1.33 (LNF 75 mg) and 0.83 (LNF 100 mg) (p = 0.001)

According to the above, the lower dosed patients had a better response(?!?).  This is odd to say the least.  I don’t think the market liked that.

The LOWR-2 results also showed that increasing the dosage had a murky impact on efficacy.  I’ve pasted that abstract below.  As I mentioned earlier, this is the second presentation of LOWR-2, as it was completed earlier than LOWR-3 and LOWR-4.  There is a video of the earlier results that were presented here.

As the abstract describes, one arm of the study had 25mg Lonafarnib twice daily along with 100mg Ritonavir.  Those patients saw a mean log decline of 1.74 log IU/ml (albeit over 24 weeks), quite a bit better than the higher-dosed 12-week patients from LOWR-3.  I realize this comparison is not quite apples to apples, but again, it adds to the cloudy picture around efficacy and increased dosing.  Indeed the study concluded that “low dose regimens had comparable antiviral efficacy with less GI side effects than the high dose regimens.”

By far the best piece of news from the LOWR-2 study (and I think what the market is really overlooking) was the outsized effect of the arm that used Lonafarnib, Ritonavir and Lambda together.  Repeating the relevant excerpt from the abstract (my underline):

[Lonafarnib] 25 mg BID + RTV + PEG-IFN alpha, however, resulted in a mean log decline of -5.57 ( ± 1.99 log10 U/ml), with 3 of 5 (60%) subjects becoming HDV-RNA PCR-negative and 5 of 5 (100%) of subjects achieving HDV-RNA BLOQ

In the conclusion the authors speculated about a cure:

NF 25 mg BID + RTV + PEG-IFN alpha leads to the highest rate of HDV-RNA PCR-negativity on 24-week treatment, and suggests that LNF and PEG-IFN lambda have synergistic activity. These regimens are generally well-tolerated, supporting longer duration studies of greater than 24 weeks, which may lead to HDV cure.

Note: In my original write-up I mistakenly equated PEG-IFN alpha with PEG-IFN Lambda.  I didn’t pay enough attention to the Greek letter being used.  These are different drugs.  Alpha was used in the 3-drug tests with Lonafarnib that I talk about here.  But in future tests Lambda will be used.  Lambda is the drug that Eiger owns.  Eiger says that Lambda has similar downstream signaling pathways to Alpha and that, because it targets a different receptor, it is expected to have fewer side effects.  But obviously this means there is a little more uncertainty than if Lambda had been used in the 3-drug trial.  So my conclusions below are a little less pronounced.

To get a better sense of just how well the 3-drug cohort worked, I snipped this screenshot from the earlier presentation of the results.  The 5 patient group taking Lambda in addition to Lonafarnib and Ritonavir is in green.

This seems quite promising.

What do I think of all of it?

Well for one I think it’s a great learning experience for me.  I’m digging into data and trying to make sense of it, and at the end of this investment I’ll be able to look back and see what I got right and what I got wrong and hopefully learn a lot for the next time.

With respect to the HDV phase 2 results, I suspect that the stock has overreacted.  I understand that the results are messy, but there is clearly efficacy here.  Maybe the most important point to consider is that Lonafarnib is producing a large standard deviation and these are small samples, so the noise is producing some inconsistent data.

Moreover, it seems very significant to me that the 3 drug combination that includes Lambda had such impressive results.   With two drugs in development for HDV there are a number of ways Eiger can win here.

The other consideration that I haven’t focused on in this post is that this is only one of Eiger’s programs.  As I mentioned earlier, they also have a Lambda program, a Post-Bariatric Hypoglycemia program, a pulmonary arterial hypertension program and a Lymphedema program.  Each of these is in Phase 2 and will have readouts this year.

There are a lot of shots on goal here.  And this is a $70 million market capitalization stock with $60 million of cash, so it’s not pricing in a lot of success.  Again, I would recommend going back to their presentations to learn more about the other programs.  I’ll probably talk more about each of them as new data comes out.

Silicom Design Wins

I took a starter position in Silicom a couple months ago.  I did so because I thought their products were aligned with the software/hardware decoupling that is occurring.  But I kept my position small until I saw more results.

Those results came a couple of weeks ago when the company announced a huge design win for a 100G switch fabric network interface card (NIC):

Silicom has received initial purchase orders (POs) in the aggregate amount of $17 million to cover a small-volume Alpha phase, an intensive Beta program and the product’s first commercial deployment. Having completed deliveries for the Alpha phase, Silicom is now in the process of delivering the Beta-program products while completing two additional activities: 1) finalizing the product configuration and validating the solution’s performance within the servers in which the Silicom products will be deployed, in cooperation with a Tier-1 server manufacturer; and 2) ramping up product manufacturing to a full mass-production level. Based on the customer’s guidance, Silicom forecasts that revenues related to the design win will build to more than $30 million per year.

I added to my position after the news release.

I was surprised that the stock didn’t move more on the news.  I bought into the stock early in the day on the 21st at around $46-$47 but saw it tail back down to $45 as the day went on.  It seems to have just been a slow response; on Friday the stock had butted up against the $50 mark (editor’s note: maybe not! It’s Wednesday now and $50 is no more!).

If Silicom achieves the $30 million run rate they expect, I think this contract has a pretty big impact on the valuation, enough that it maybe isn’t even all priced in, even after the $10 move.

Below I’ve added the $30 million onto a 15% growth estimate (my own estimate; Silicom has said double digit growth for 2017) and assumed that expenses (R&D, G&A) grow by half as much as revenue (the company has said that their business model is levered to growth and that the “majority” of new gross margins will fall to the bottom line, so I think this is reasonable).

I might be too optimistic about the growth rate; maybe some of that $30 million should be part of the 15%.  Everyone can judge that for themselves.  I’ve chosen the assumption because I like the prospects.

If I’m not, then at $50 the company is trading at 10x forward EBITDA, which is not an aggressive multiple given 30%+ growth in 2017.
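For anyone who wants to tinker with the assumptions, here is a rough sketch of that back-of-envelope model.  The baseline revenue and expense inputs are placeholders to fill in from Silicom’s actual trailing numbers (they are not company figures); the growth and margin assumptions are the ones described above:

```python
# Sketch of my back-of-envelope forward model (assumption-laden; the baseline
# inputs below are placeholders, not figures from Silicom).
base_revenue = 100e6       # placeholder: trailing twelve-month revenue
base_opex = 25e6           # placeholder: trailing R&D + G&A
gross_margin = 0.40        # gross margins are generally around 40%

organic_growth = 0.15      # my own 15% growth estimate
design_win_revenue = 30e6  # the ~$30M/yr run rate from the switch fabric win

fwd_revenue = base_revenue * (1 + organic_growth) + design_win_revenue
# Assumption from the write-up: expenses grow at half the percentage rate of revenue.
revenue_growth = fwd_revenue / base_revenue - 1
fwd_opex = base_opex * (1 + revenue_growth / 2)

# Treat gross profit minus operating expenses as a rough EBITDA proxy (ignores D&A add-back).
fwd_ebitda = fwd_revenue * gross_margin - fwd_opex
print(f"Forward revenue: ${fwd_revenue/1e6:.0f}M, forward EBITDA: ${fwd_ebitda/1e6:.0f}M")
```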

Let’s look at the products

I am hopeful that this high growth rate is sustainable.  Silicom has products aligned to a number of growing segments.  To understand my enthusiasm, let’s take a closer look at the product line.

I’m still a little foggy on terminology here, so I apologize if I am classifying something wrong.  Most of what Silicom sells falls under the broad category of network interface cards (NICs).  Within that category are server adapter cards, which is where the majority of their NICs, switches, and FPGA cards sit.

So what do these cards do?  They provide network connectivity and offload tasks from the CPU (buffer storage, processing packets, that sort of thing) so that the appliance they are working with can run more efficiently and focus on the dedicated tasks they are intended to do.

The cards come in a variety of flavors.  There are different network speeds (1G to 100G) and different tasks they are designed to offload.  These include data encryption, acceleration (where a chipset on the card performs some CPU tasks at times of peak usage), data compression, time stamping, and bypass, which recognizes failure of an appliance and reroutes data when it occurs.  There are also FPGA cards, which are programmable and can be made to handle a custom task.

You can see the full list of flavours here.  Below I’ve taken a screen shot of the highest level breakdown of the product line, just to get a feel for what the options are and what the cards look like:

There are also higher end programmable cards using FPGA chips (field programmable gate arrays).  The FPGA cards are an “efficient way for the advanced user to achieve even lower latency and to implement any filters or acceleration that are necessary for a specific application.”  These cards are used in “networking, financial and big data solutions” applications.  The FPGA-based solutions are a product line that came with the acquisition of Fiberblaze in December 2014.

Recently Silicom has had design wins for time stamping with a Tier 1 monitoring company (which I read somewhere was likely Gigamon), for encryption cards with a former customer that they had lost 3 years ago, the aforementioned very large win for a 100G switch fabric NIC, and most recently for bypass cards to be used with a cyber security appliance.

Silicom also has a number of stand-alone products.  There are switching solutions, network appliances and a product that converts off the shelf servers into appliances (SETAC).

Silicom describes their growth opportunities as being in cloud and in cyber-security.  The cyber-security opportunity is pretty straightforward; their cards piggyback off of cyber-security appliances providing a network interface and offloading CPU tasks.  The cloud is a catch-all for many different opportunities, including integration of their cards into monitoring, packet processing, and switching – pretty much anywhere where workloads can be offloaded from a CPU, thus creating efficiency.

SD-WAN Market

Also part of cloud is Silicom’s entry into the SD-WAN market.  I’m going to talk about this one in more detail because I think it’s potentially a big opportunity, and it has the visibility to get the company noticed by analysts.  Their product is an off-the-shelf virtual CPE solution.

SD-WAN is one of the first applications to embrace network function virtualization, or NFV, something I ramble on about when talking about Radcom or Radisys.  SD-WAN entails the decoupling of software from hardware for routing traffic at edges of the network.  As such, traditional proprietary appliances are replaced with “software application running on inexpensive appliances to implement a flexible traffic routing solution between branch offices and the Cloud” (from this article).

Demand for SD-WAN from service providers is surging.  Silicom already has two design wins for SD-WAN appliances.  The first is with Versa Networks, where Silicom is one of three companies (along with Advantech and Lanner) providing hardware.  They announced the win in September.  A second win was for a customized SD-WAN vCPE appliance, which is expected to scale to $5 million annually, and was announced in November.  In this case the customer wasn’t announced, but my guess is it’s Velocloud, which seems a likely bet to already be an encryption card customer.

This SeekingAlpha article suggests that SD-WAN deals could be in the $10 million range, which is a lot bigger than the typical win Silicom has.

As part of the second win, Silicom said this in the press release:

“In fact, the customer’s forecast is another clear demonstration of the momentum of the SD-WAN market, as both enterprises and service providers begin adopting the new technology to enable their transition to the Cloud, NFV and the virtualized environment. We believe that our favorable positioning in this market, due both to our basis in the WAN Optimization market and the unique new technologies that we have developed, will enable us to benefit strongly from the momentum of this ‘hot’ new market space, making SD-WAN a significant new revenue driver for Silicom.”

Silicom also announced that the same customer that they had a by-pass card design win from was “considering the potential use of Silicom’s vCPE appliances as part of its Cloud offering”.

So I like the idea and hope to see more wins.

Silicom’s gross margins are generally around 40%, which implies that the “moat” for their products is not very high for a technology company.  While this may be a bit concerning, what I find interesting is that they seem to be the largest non-integrated competitor in the business.  As this SeekingAlpha article points out, Silicom has nearly 200 different SKUs whereas their nearest competitor (Interface Masters) has 35.  The large integrated players (Intel, for example) are way bigger of course, but they also don’t offer the range of solutions that Silicom has (being constrained to their own chipsets).

I really like this idea.  I think Silicom has the right product set at the right time, ready to take advantage of the shift toward using commercial off-the-shelf hardware to accomplish more tasks.  I think the recent $30 million design win is not fully priced into the stock at current levels, and yet it may only be a harbinger of what is to come.  I bought the stock in the mid-$30s and added in the mid-$40s.  I would probably add one more time on another big win.